Comments (9)
Can you double-check if it's related to #55?
Do you have time_file and/or size_file configured?
from logstash-output-s3.
I have time_file set to 1 minute.
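For context, this is the shape of an s3 output with that setting (a minimal sketch; the bucket name and region here are placeholders, not values from this thread):

```
output {
  s3 {
    region    => "us-east-1"       # placeholder
    bucket    => "example-bucket"  # placeholder
    time_file => 1                 # rotate and upload the temp file every minute
    codec     => "json"
  }
}
```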
There are no errors whatsoever in the logs, so I can't tell you exactly what happens, but I can tell you for sure that the plugin continues to work but is not uploading.
The plugin keeps writing events to the temp file, but never uploads them.
I think that maybe if the connection to S3 is broken in the middle of the script running, then it won't reconnect, and you don't catch that error.
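If that suspicion is right, a generic guard would be to retry the upload and log each failure, so a broken connection at least becomes visible. This is an illustrative Ruby sketch under that assumption, not the plugin's actual code:

```ruby
# Illustrative retry wrapper (an assumption, not the plugin's actual code):
# surface and retry transient S3 failures instead of dropping them silently.
def with_retries(attempts: 3)
  tries = 0
  begin
    yield
  rescue StandardError => e
    tries += 1
    warn "upload failed (attempt #{tries}/#{attempts}): #{e.message}"
    retry if tries < attempts
    raise
  end
end

# Usage sketch: with_retries { upload_file_to_s3(file) }
# where upload_file_to_s3 is a hypothetical upload call.
```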
from logstash-output-s3.
I am seeing the same behavior, happens with both time_file and size_file.
I have tried each individually and also together (a small file size and a longer time) to see whether, if the size setting didn't catch it, the time setting would, but I am still having the issue.
from logstash-output-s3.
So it looks like the problem is with how the worker threads are set up: they don't stay alive after they process an upload. I tested this by varying the number of upload workers, and every time I get exactly that many files uploaded before things start queuing on disk.
logstash 2.1.1
logstash-output-s3-2.0.3
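That behavior can be reproduced in plain Ruby. This is an illustrative sketch of the suspected failure mode, not the plugin's actual code: if each worker thread exits after a single job, exactly that many jobs get processed and the rest stay queued.

```ruby
# Illustrative sketch (not the plugin's code): worker threads that exit
# after a single job process exactly `size` jobs; everything else stays
# queued, mirroring the "N files uploaded, then queuing" observation above.
def drain_once(queue, size)
  threads = Array.new(size) { Thread.new { queue.pop } }  # one job each, then exit
  threads.each(&:join)
end

queue = Queue.new
6.times { |i| queue << "upload-#{i}" }
drain_once(queue, 2)   # two "upload workers"
puts queue.size        # 4 jobs remain queued
```

A long-lived pool would instead loop inside each thread, popping jobs until shutdown.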
from logstash-output-s3.
We seem to have the exact same problem reported by zot420.
Is there any advice here? At least a workaround?
Thanks!
from logstash-output-s3.
Is this issue resolved?
from logstash-output-s3.
Unfortunately I have no idea about the bug or its resolution. However, a quick workaround for me was to restart Logstash whenever the folder started filling up more than expected.
I have nothing to share, as we are currently migrating to another solution and therefore wrote no automation script (aside from standard disk monitoring) to manage that.
Hope this helps at least a little.
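That workaround can be sketched as a small watchdog check. The temp path and threshold below are assumptions, not values from this thread:

```ruby
# Hypothetical watchdog check (sketch): returns true when the plugin's
# temporary directory holds more leftover part files than expected.
# The directory path and threshold are assumptions, not from this thread.
def needs_restart?(tmp_dir, threshold)
  leftover = Dir.glob(File.join(tmp_dir, "**", "*")).count { |f| File.file?(f) }
  leftover > threshold
end

# e.g. from a cron job:
#   system("systemctl", "restart", "logstash") if needs_restart?("/tmp/logstash", 500)
```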
from logstash-output-s3.
My team is also running into this bug consistently. I don't think it is related to workers, though: we have multiple identical machines, but one of them stopped uploading at 998 files, while the other is at 1400 files and still working correctly. My guess is that the S3 connection returns some sort of unexpected error that the plugin doesn't handle correctly. It must be the S3 upload step: file rotation still works as expected; the file just never gets uploaded.
Restarting logstash does work, but is obviously not a good solution. We will likely look into alternatives until this is fixed.
from logstash-output-s3.
I am testing Logstash in Docker with this plugin, pointed at LocalStack.
I ran a test case and found files under /tmp/logstash, but they were not uploaded to S3.
When I restarted Logstash (kill -TERM), /tmp/logstash was emptied, and checking the S3 bucket showed the documents were indeed there. It seems this plugin has some state/buffering/flushing problem.
Here is my output config file:
output {
  s3 {
    endpoint => "http://localstack:4566"
    access_key_id => "test"
    secret_access_key => "test"
    additional_settings => { "force_path_style" => true }
    validate_credentials_on_root_bucket => false
    region => "us-east-1"
    bucket => "em-top-archive-us-east-1-local-localstack"
    codec => "json"
    canned_acl => "private"
    prefix => "year=%{[@metadata][index_year]}/month=%{[@metadata][index_month]}/day=%{[@metadata][index_day]}/type=%{[@metadata][index]}/project=%{[project.name]}/environment=%{[ecsenv]}"
  }
}
from logstash-output-s3.
Related Issues (20)
- Logstash-client failed uploading to S3 HOT 1
- During initializing : syntax error, unexpected tLABEL HOT 1
- S3 Storage Tiers Missing
- Please update to AWS SDK v3 HOT 4
- Add trusted CA cert setting
- Too many open files in syste HOT 1
- Impact of LetsEncrypt certificate expiry HOT 5
- Logstash sometimes ignore the Prefix and creates UUID folders
- After the file is uploaded to s3, the content is wrong, is it an encoding problem? HOT 2
- [Docs] Document workaround when using private link endpoints with us-east-1
- Filename option HOT 2
- Allow the use of the Intelligent-Tiering storage class
- Seems GzipUtil.compress output a broken gzip file sometimes
- May I know in which version of Logstash was the #249 issue fixed? HOT 1
- Is there a way to count the number of events packaged into an s3 object?
- Add zstd support
- validate_credentials_on_root_bucket is ignored
- Gzip compression not working HOT 1
- Plugin S3 output- (EMFILE) Too many open files
- Logstash S3 Output plugin requires both access_key_id and secret_access_key (if both not provided throws errors) HOT 11