yoshidaagri

How to solve big log capture errors in Fluentd

Summary:

  • This is a memo from building a log monitoring system with Fluentd.

Architecture:

  • See the figure above: Fluentd tails the Nginx access log and bulk-inserts it into the MySQL access_log table. A minimal configuration sketch follows below.
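
For context, here is a minimal sketch of the Fluentd configuration this setup implies. It assumes the fluent-plugin-mysql mysql_bulk output (which produces the "bulk insert values size" log lines shown below) and the standard in_tail input with the nginx parser; the pos_file path, credentials, database name, and column names are placeholders, so adjust them to your environment.

<source>
  @type tail
  path /var/log/nginx/access.log
  pos_file /var/log/td-agent/nginx-access.log.pos
  tag nginx.access
  <parse>
    @type nginx
  </parse>
</source>

<match nginx.access>
  # mysql_bulk batches records and issues one bulk INSERT per buffer flush.
  @type mysql_bulk
  host localhost
  database logdb
  username fluentd
  password xxxxxxxx
  table access_log
  column_names remote,method,path,code,size,agent
</match>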

Error message from my Fluentd:

2019-08-30 09:44:20 +0900 [info]: #0 bulk insert values size (table: access_log) => 37207
2019-08-30 09:44:20 +0900 [warn]: #0 failed to flush the buffer. retry_time=1 next_retry_seconds=2019-08-30 09:44:21 +0900 chunk="5914aebbe44b1ad68c3f180a99706368" error_class=Mysql2::Error::ConnectionError error="MySQL server has gone away"
  • ...gone away?

Cause of the error:

  • The cause of the error is my MySQL configuration.
  • The setting is max_allowed_packet, whose default is 4M (you can check the current value as shown below).
  • I was using the default setting. However, the log file to capture is 9M...
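
To confirm this, you can check the current limit from the mysql client with a standard SHOW VARIABLES statement; with the 4M default, the output will look roughly like this (4194304 bytes = 4M):

mysql> SHOW VARIABLES LIKE 'max_allowed_packet';
+--------------------+---------+
| Variable_name      | Value   |
+--------------------+---------+
| max_allowed_packet | 4194304 |
+--------------------+---------+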

Fix:

  • Changed the MySQL setting (a minimal config sketch follows below).
    • before: max_allowed_packet = 4M
    • after : max_allowed_packet = 16M
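
For reference, a minimal sketch of the change, assuming the option lives under the [mysqld] section of my.cnf (the file location depends on your installation, e.g. /etc/my.cnf or /etc/mysql/my.cnf):

[mysqld]
# Allow packets up to 16M; this must cover the largest bulk INSERT Fluentd flushes.
max_allowed_packet = 16M

Restart MySQL afterwards, or apply it at runtime with SET GLOBAL max_allowed_packet = 16777216; (it takes effect for new connections, such as Fluentd's retries).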

Result:

2019-08-30 16:10:29 +0900 [info]: #0 detected rotation of /var/log/nginx/access.log; waiting 5 seconds
2019-08-30 16:10:35 +0900 [info]: #0 following tail of /var/log/nginx/access.log
2019-08-30 16:10:39 +0900 [info]: #0 bulk insert values size (table: access_log) => 37207
2019-08-30 16:10:47 +0900 [info]: #0 bulk insert values size (table: access_log) => 13854

Great!!

Thank you!
