Logstash lost data during log rotate #214
Comments
I got the same issue, any ideas?
@lrbsunday I have tested with the input path as "/tmp/log/test" and as "/tmp/log/test*", and the data loss is the same.
I have added a flag (:rotate_flag) in "watched_file.rb" to avoid the data loss, but I am not sure whether my change will introduce other issues. Maybe you can give me some advice.
The temporary workaround is to use start_position => "beginning".
Thanks for your reply. I have tested with start_position => "beginning" and it works. But as you said, it is a temporary workaround. Maybe we need a fix.
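For reference, a minimal file-input configuration using that workaround might look like this. start_position and sincedb_path are documented options of logstash-input-file; the sincedb path below is illustrative, not from the original report:

```
input {
  file {
    path => "/tmp/log/test*"
    start_position => "beginning"
    # Persist read offsets so a restart does not re-ingest old data;
    # the path here is an example.
    sincedb_path => "/tmp/sincedb_test"
  }
}
```

Note that start_position only affects files the plugin has never seen before; files already tracked in the sincedb resume from their recorded offset.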
I used logstash-input-file (4.1.4) to ingest from a file and found data loss during log rotation.
I configured logrotate to keep 3 rotated files, and rotation happens when the file exceeds 1k.
My configuration of log rotate:
{
missingok
size 1k
notifempty
sharedscripts
rotate 3
}
My script to generate logs and trigger rotation:
for (( i=1 ; i <= 100000; i++ ))
do
echo "$i this is a bunch of test data blah blah" >> /tmp/log/test
if ! ((i % 1000)); then
sleep 1
fi
if ! ((i % 30000 || i == 100000)); then
/usr/sbin/logrotate -f /etc/logrotate.d/test &
fi
done
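To see why the glob picks up a fresh file at all, it helps to recall what logrotate's default "create" mode does (a sketch under the assumption that copytruncate is not set, which matches the config above): the old file is renamed, keeping its inode, and a brand-new empty file is created under the original name.

```shell
#!/bin/sh
# Simulate logrotate's default "create" rotation with mv + a new file.
dir=$(mktemp -d)
echo "line written before rotation" >> "$dir/test"
old_inode=$(ls -i "$dir/test" | awk '{print $1}')

mv "$dir/test" "$dir/test.1"   # rename: the rotated file keeps the old inode
: > "$dir/test"                # logrotate then creates a new, empty "test"
echo "line written after rotation" >> "$dir/test"

rotated_inode=$(ls -i "$dir/test.1" | awk '{print $1}')
new_inode=$(ls -i "$dir/test" | awk '{print $1}')
echo "old=$old_inode rotated=$rotated_inode new=$new_inode"
```

Because the new "test" has a different inode, the file input treats it as a newly discovered file, which is where the initial-seek behavior described below comes into play.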
My configuration of logstash:
input {
file {
path => "/tmp/log/test*"
}
}
output {
file {
path => "/tmp/output.txt"
codec => line { format => "custom format: %{message}" }
}
}
Data loss happened as shown below:
I found that the creation of the new "log" file caused the data loss. I checked the source code and found that the new "log" file was missing some lines at the beginning (the seek operation in create_initial.rb causes this issue). It means that lines written to the new "log" file during rotation, before the plugin discovers it, are lost.
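The loss window described above can be sketched with plain shell tools (a simplified model, not the plugin's actual code): if a reader discovers a file only after some lines have already been written to it, and its initial seek starts at the current end of file, everything before that offset is skipped.

```shell
#!/bin/sh
# Model: a new log file receives lines before the tailing reader notices it.
dir=$(mktemp -d)
f="$dir/test"
: > "$f"
for i in 1 2 3; do
  echo "early line $i" >> "$f"   # written between rotation and discovery
done

# A reader that seeks to end-of-file on discovery remembers this offset:
offset=$(wc -c < "$f")

echo "late line" >> "$f"          # written after discovery

# Reading from the remembered offset yields only the later data:
seen=$(tail -c +"$((offset + 1))" "$f")
echo "$seen"
```

The three "early" lines never reach the reader, which matches the symptom: records written to the freshly created file during rotation disappear from the output.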
Please give me some advice on this issue.
Thanks,
Tsukiand