Improve the way the jdbc input handles connection failures #144
Comments
This is confirmed with SQL Server on Logstash startup; however, I was not able to break a running connection by changing a password or restarting the server - it seems this problem hits the initial connection only. Failed TCP connection:
Password must be changed:
Does that mean the connection is recovered automatically, or is it dead until you restart Logstash?
@hummingV - I found that the connection would recover automatically in that case. Test Conditions:
Reported Failure:
This is a really awful bug. The whole Logstash process crashes when the jdbc input can't connect.
It looks like the underlying Sequel library handles the connection pool pretty well. Once Logstash is started, DB connection issues won't crash Logstash and tend to recover automatically on the next scheduled run (as long as the connection issue is resolved by then). IMO,
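Regarding the recovery behaviour described above, the following is a minimal Ruby sketch, not the plugin's actual code, of how a failure inside one scheduled run can be rescued so that only that run is lost and the next run gets a fresh connection from the Sequel pool. The connection string, credentials, query, and helper name are made up for illustration.

```ruby
require "logger"
require "sequel"

logger = Logger.new($stdout)

# Hypothetical JDBC URL, credentials, and query; adjust for your environment.
db = Sequel.connect("jdbc:sqlserver://dbhost:1433;databaseName=mydb",
                    user: "logstash", password: "secret")

def run_scheduled_query(db, logger)
  db.fetch("SELECT id, message FROM events").each do |row|
    # Hand each row off to the pipeline queue here.
  end
rescue Sequel::DatabaseConnectionError, Sequel::DatabaseError => e
  # Only this run is lost; when the scheduler fires again, the pool checks
  # out (or re-creates) a connection and the query is retried from scratch.
  logger.warn("JDBC query failed, will retry on next scheduled run: #{e.message}")
end

run_scheduled_query(db, logger)
```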
@ruria, re: 2. Logstash can't connect at any point in time --> the whole Logstash process crashes
I'm working with SQL Server (sqljdbc_6.0); you can see the output just below:

{:timestamp=>"2016-12-10T18:59:16.279000+0100", :message=>"Pipeline aborted due to error",
 :exception=>#<Sequel::DatabaseConnectionError: Java::ComMicrosoftSqlserverJdbc::SQLServerException: The TCP/IP connection to the host vzz04, port 51433 has failed. Error: "Connection timed out: no further information. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".>,
 :backtrace=>[
  "com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(com/microsoft/sqlserver/jdbc/SQLServerException.java:191)",
  "com.microsoft.sqlserver.jdbc.SQLServerException.ConvertConnectExceptionToSQLServerException(com/microsoft/sqlserver/jdbc/SQLServerException.java:242)",
  "com.microsoft.sqlserver.jdbc.SocketFinder.findSocket(com/microsoft/sqlserver/jdbc/IOBuffer.java:2369)",
  "com.microsoft.sqlserver.jdbc.TDSChannel.open(com/microsoft/sqlserver/jdbc/IOBuffer.java:551)",
  "com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:1962)",
  "com.microsoft.sqlserver.jdbc.SQLServerConnection.login(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:1627)",
  "com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:1458)",
  "com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(com/microsoft/sqlserver/jdbc/SQLServerConnection.java:772)",
  "com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(com/microsoft/sqlserver/jdbc/SQLServerDriver.java:1168)",
  "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:483)",
  "RUBY.connect(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/adapters/jdbc.rb:222)",
  "RUBY.make_new(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool.rb:103)",
  "RUBY.make_new(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:224)",
  "RUBY.available(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:197)",
  "RUBY._acquire(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:133)",
  "RUBY.acquire(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:147)",
  "RUBY.sync(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:265)",
  "org.jruby.ext.thread.Mutex.synchronize(org/jruby/ext/thread/Mutex.java:149)",
  "RUBY.sync(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:265)",
  "RUBY.acquire(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:146)",
  "RUBY.hold(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/connection_pool/threaded.rb:104)",
  "RUBY.synchronize(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/database/connecting.rb:256)",
  "RUBY.test_connection(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/sequel-4.33.0/lib/sequel/database/connecting.rb:266)",
  "RUBY.prepare_jdbc_connection(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-3.0.2/lib/logstash/plugin_mixins/jdbc.rb:170)",
  "RUBY.register(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/logstash-input-jdbc-3.0.2/lib/logstash/inputs/jdbc.rb:167)",
  "RUBY.start_inputs(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.2-java/lib/logstash/pipeline.rb:330)",
  "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)",
  "RUBY.start_inputs(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.2-java/lib/logstash/pipeline.rb:329)",
  "RUBY.start_workers(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.2-java/lib/logstash/pipeline.rb:180)",
  "RUBY.run(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.2-java/lib/logstash/pipeline.rb:136)",
  "RUBY.start_pipeline(c:/logstash/packages/logstash_ssl_support/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.2-java/lib/logstash/agent.rb:465)",
  "java.lang.Thread.run(java/lang/Thread.java:745)"
 ], :level=>:error}
Hi, I have experienced a similar issue with a scheduled jdbc job and I think it may be related to this. My setup is Logstash 2.4.0 + SQL Server 2008 R2. The job is a nightly job that executes a simple query against SQL Server and stores the data in Elasticsearch. The problem I've encountered twice in the last month is that whenever something goes wrong with the SQL Server during the day, Logstash fails to connect to it later on and the following message is logged: "Java::ComMicrosoftSqlserverJdbc::SQLServerException: Connection reset by peer: socket write error". The first time, the cause was a SQL Server restart earlier that day; yesterday it happened after a networking component between the two servers was replaced. To avoid confusion: when the job was scheduled to execute (midnight), connectivity was not an issue, so this happens after connectivity problems earlier that day. It seems that Logstash or the underlying jdbc driver tries to reuse an existing (by then broken) connection instead of establishing a new one. It continues to run, however, unlike what @ruria and @ESamir experienced. If you feel this belongs in its own (or another) ticket, please let me know and I'll post it separately. Thanks!
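For the stale-connection scenario described above, one option worth mentioning is Sequel's connection_validator extension, which checks a pooled connection before handing it out. This is only a hedged suggestion, not something the plugin enables out of the box, and the connection details below are hypothetical.

```ruby
require "sequel"

# Hypothetical connection details.
db = Sequel.connect("jdbc:sqlserver://dbhost:1433;databaseName=mydb",
                    user: "logstash", password: "secret")

db.extension(:connection_validator)
# -1 validates the connection on every checkout (instead of only after the
# default idle window), at the cost of one extra round trip per checkout.
db.pool.connection_validation_timeout = -1
```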
Closing this issue as fixed with 4.3.0. 4.3.0 will open and close the connection for each query rather than attempting to hold connections open for long periods of time. Connection issues should be isolated to the given query on that schedule. Errors should be handled gracefully, and future scheduled runs should start from a clean state with respect to the connection.
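As a rough illustration of the per-query behaviour described for 4.3.0 (a sketch under assumptions, not the release's literal code), Sequel's block form of connect makes the open-query-close pattern straightforward, since it disconnects when the block returns:

```ruby
require "sequel"

# Connect, run the statement, and disconnect again, so a broken connection
# cannot leak from one scheduled run into the next. Connection details and
# the statement below are placeholders.
def execute_scheduled_query(connection_string, opts, statement)
  Sequel.connect(connection_string, opts) do |db|  # block form disconnects on exit
    db.fetch(statement).each { |row| yield row }
  end
end

execute_scheduled_query("jdbc:sqlserver://dbhost:1433;databaseName=mydb",
                        { user: "logstash", password: "secret" },
                        "SELECT id, message FROM events") do |row|
  # Hand the row off to the pipeline queue here.
end
```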
Please open a new issue if this does not resolve the problem.
When the jdbc connection fails because the database host is not reachable, the whole Logstash process crashes.
We should just log the exception and keep Logstash running, so the failure does not affect other inputs, and keep trying to connect to the database until it becomes reachable again (see the sketch below).
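A minimal sketch of the behaviour requested here (the helper name and retry interval are made up, and this is not plugin code): log the failure and retry the connection instead of letting the exception escape and terminate the process.

```ruby
require "logger"
require "sequel"

# Log each failure and keep retrying until the database host is reachable,
# instead of letting the exception escape and terminate the process.
def connect_with_retry(connection_string, opts, logger, retry_interval = 30)
  begin
    db = Sequel.connect(connection_string, opts)
    db.test_connection
    db
  rescue Sequel::DatabaseConnectionError => e
    logger.error("JDBC connection failed, retrying in #{retry_interval}s: #{e.message}")
    sleep retry_interval
    retry
  end
end
```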