
Logstash Pipeline : input jdbc - Must only be called before "cook()" | [BUG] #464

Open
rehantoday opened this issue Dec 19, 2022 · 1 comment
Labels
bug Something isn't working

Comments

@rehantoday

Environment

  • DocumentDB JDBC driver version: 1.4.2
  • DocumentDB server version: MongoDB 4.0 Compatibility
  • OS: Ubuntu 20.04
  • BI Tool or client name: Logstash
  • BI Tool or client version:
  • Java version (if known): openjdk 11.0.17

Problem Description

  1. Steps to reproduce:

I am trying to run a Logstash pipeline that reads input from DocumentDB (MongoDB-compatible) via the JDBC input plugin and sends the output to Elasticsearch.

input {
    jdbc{
        jdbc_driver_library => ""
        jdbc_driver_class => "software.amazon.documentdb.jdbc.DocumentDbMain"
        jdbc_connection_string => "jdbc:documentdb://user:[email protected]/testDB?tlsAllowInvalidHostnames=true&tlsCAFile=rds-combined-ca-bundle.pem"
        jdbc_user => ""
        jdbc_password => ""
        statement => "select * FROM `products` where `id` = 10"
    }
}
filter {}
output{
  stdout { codec => rubydebug { metadata=> true } }
  elasticsearch {
    action => "update"
    doc_as_upsert => true
    index => "logstash-products-index"
    hosts => ["localhost:9200"]
    user => ""
    password => ""
  }
}
  2. Expected behaviour:

Get product from DocumentDB and add to ElasticSearch.

  3. Actual behaviour:

When the pipeline is run, it fails and throws an error.

  4. Error message/stack trace:

[ERROR] 2022-12-19 14:52:25.217 [Ruby-0-Thread-17: :1] javapipeline - Worker loop initialization error {:pipeline_id=>"main", :error=>"Must only be called before "cook()"", :exception=>Java::JavaLang::IllegalStateException, :stacktrace=>"org.codehaus.janino.SimpleCompiler.assertUncooked(org/codehaus/janino/SimpleCompiler.java:526)\norg.codehaus.janino.SimpleCompiler.cook(org/codehaus/janino/SimpleCompiler.java:252)\norg.codehaus.janino.SimpleCompiler.compileToClassLoader(org/codehaus/janino/SimpleCompiler.java:517)\norg.codehaus.janino.SimpleCompiler.cook(org/codehaus/janino/SimpleCompiler.java:241)\norg.codehaus.janino.SimpleCompiler.cook(org/codehaus/janino/SimpleCompiler.java:219)\norg.codehaus.commons.compiler.Cookable.cook(org/codehaus/commons/compiler/Cookable.java:79)\norg.codehaus.commons.compiler.Cookable.cook(org/codehaus/commons/compiler/Cookable.java:74)\norg.logstash.config.ir.compiler.ComputeStepSyntaxElement.lambda$compile$0(org/logstash/config/ir/compiler/ComputeStepSyntaxElement.java:150)\njava.util.concurrent.ConcurrentHashMap.computeIfAbsent(java/util/concurrent/ConcurrentHashMap.java:1708)\norg.logstash.config.ir.compiler.ComputeStepSyntaxElement.compile(org/logstash/config/ir/compiler/ComputeStepSyntaxElement.java:140)\norg.logstash.config.ir.compiler.ComputeStepSyntaxElement.instantiate(org/logstash/config/ir/compiler/ComputeStepSyntaxElement.java:124)\norg.logstash.config.ir.compiler.DatasetCompiler.terminalOutputDataset(org/logstash/config/ir/compiler/DatasetCompiler.java:212)\norg.logstash.config.ir.CompiledPipeline$CompiledExecution.compileOutputs(org/logstash/config/ir/CompiledPipeline.java:414)\norg.logstash.config.ir.CompiledPipeline$CompiledExecution.(org/logstash/config/ir/CompiledPipeline.java:379)\norg.logstash.config.ir.CompiledPipeline$CompiledUnorderedExecution.(org/logstash/config/ir/CompiledPipeline.java:337)\norg.logstash.config.ir.CompiledPipeline.buildExecution(org/logstash/config/ir/CompiledPipeline.java:156)\norg.logstash.execution.WorkerLoop.(org/logstash/execution/WorkerLoop.java:67)\njdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)\njdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(jdk/internal/reflect/NativeConstructorAccessorImpl.java:77)\njdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(jdk/internal/reflect/DelegatingConstructorAccessorImpl.java:45)\njava.lang.reflect.Constructor.newInstanceWithCaller(java/lang/reflect/Constructor.java:499)\njava.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:480)\norg.jruby.javasupport.JavaConstructor.newInstanceDirect(org/jruby/javasupport/JavaConstructor.java:237)\norg.jruby.RubyClass.new(org/jruby/RubyClass.java:911)\norg.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)\nusr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.init_worker_loop(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:580)\nusr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:286)\norg.jruby.RubyProc.call(org/jruby/RubyProc.java:309)\njava.lang.Thread.run(java/lang/Thread.java:833)", :thread=>"#<Thread:0x2c74eb8e sleep>"}

  5. Any other details that can be helpful:
    N/A

Screenshots

N/A


JDBC log

rehantoday added the bug label Dec 19, 2022
@birschick-bq
Contributor

birschick-bq commented Dec 19, 2022

@rehantoday
I'm not that familiar with Logstash.

However, just a quick check: I noticed you are using jdbc_driver_class => "software.amazon.documentdb.jdbc.DocumentDbMain".

The value I believe you need here is:

  1. jdbc_driver_class => "software.amazon.documentdb.jdbc.DocumentDbDriver"
  2. As well, you may need to set jdbc_driver_library => ".../documentdb-jdbc-1.4.2-all.jar" (see the sketch below).
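
For reference, the input section would then look roughly like the sketch below. This is only an illustration: the jar path, host, database, and credentials are placeholders (not values from this issue), and the option names are the standard logstash-input-jdbc settings already used in the report.

input {
    jdbc {
        # Path to the shaded "all" driver jar (placeholder path)
        jdbc_driver_library => "/opt/drivers/documentdb-jdbc-1.4.2-all.jar"
        # Use the java.sql.Driver implementation, not the DocumentDbMain utility class
        jdbc_driver_class => "software.amazon.documentdb.jdbc.DocumentDbDriver"
        jdbc_connection_string => "jdbc:documentdb://user:[email protected]/testDB?tlsAllowInvalidHostnames=true&tlsCAFile=rds-combined-ca-bundle.pem"
        jdbc_user => "user"
        jdbc_password => "password"
        statement => "SELECT * FROM `products` WHERE `id` = 10"
    }
}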

Are you running this within the same VPC as the DocumentDB server? (xxxxxxxxxxxxxxxxxxxxx.docdb.amazonaws.com) If not, you will also need to connect through an SSH tunnel; see the example below.
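
If it is running outside the VPC, one common approach is plain SSH local port forwarding through a bastion host, then pointing the connection string at localhost. The key path and host names below are placeholders, not values from this issue:

# forward local port 27017 to the DocumentDB cluster via a bastion in the same VPC
ssh -i ~/.ssh/bastion-key.pem -N -L 27017:xxxxxxxxxxxxxxxxxxxxx.docdb.amazonaws.com:27017 ec2-user@bastion-host

With the tunnel up, the JDBC URL would point at localhost:27017, and tlsAllowInvalidHostnames=true is typically still needed because the server certificate will not match localhost.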
