
I'm currently in the early stages of setting up some Kibana dashboards using the ELK stack and a MySQL database. According to the Logstash config check utility my .conf file is valid, but nothing is showing up in Elasticsearch.

First off, my DB is populated:

mysql> SELECT COUNT(session_id) AS session_id FROM scans;
+------------+
| session_id |
+------------+
|          6 |
+------------+
1 row in set (0.00 sec)

And here is my logstash conf file:

input {
        jdbc {
           jdbc_connection_string => "jdbc:mysql://localhost:3306/dashboarddb"
           jdbc_user => "user"
           jdbc_password => "password"
           jdbc_driver_library => "/home/ubuntu/mysql-connector-java-8.0.16/mysql-connector-java-8.0.16.jar"
           jdbc_driver_class => "com.mysql.jdbc.Driver"
           statement => "SELECT * FROM scans;"
         }
}

output {
         elasticsearch {
             hosts => "localhost:9200"
             index => "scans"
             document_id => "%{session_id}"
       }

}
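(As a sanity check, a `stdout` output can be added alongside the `elasticsearch` one to print each event to the console; this is a debugging sketch, not part of my original config:)

```
output {
    # Prints every event to the console so you can see whether the
    # jdbc input is emitting anything at all (debugging aid only)
    stdout { codec => rubydebug }

    elasticsearch {
        hosts => "localhost:9200"
        index => "scans"
        document_id => "%{session_id}"
    }
}
```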

When I start logstash:

[2019-06-07T19:30:45,740][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-07T19:30:51,727][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-07T19:30:51,924][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-07T19:30:51,973][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-07T19:30:51,976][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-06-07T19:30:52,002][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-06-07T19:30:52,022][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-07T19:30:52,029][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x3cc19c34 run>"}
[2019-06-07T19:30:52,148][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-07T19:30:52,237][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-07T19:30:52,344][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-07T19:30:52,808][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-06-07T19:30:53,742][INFO ][logstash.inputs.jdbc     ] (0.018699s) SELECT * FROM scans;
[2019-06-07T19:30:55,542][INFO ][logstash.runner          ] Logstash shut down.

And when I check elastic:

/var/log/logstash$ curl -H "Content-Type: application/json" -XGET '127.0.0.1:9200/scans/_search?q=Something&pretty'
{
  "took" : 826,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 0,
      "relation" : "eq"
    },
    "max_score" : null,
    "hits" : [ ]
  }
}
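(Note that `?q=Something` only matches documents actually containing the literal term "Something", so it can return zero hits even when the index has data. A broader check, sketched below, is a `match_all` query, or simply hitting `/scans/_count`:)

```
GET /scans/_search
{
  "query": { "match_all": {} }
}
```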

So I'm really not sure where to go from here -- any help would be appreciated!

  • Please increase the jdbc input plugin's logging to debug level and let's see what is happening there. Commented Jun 8, 2019 at 20:34
  • I rebooted my machine and everything worked fine -- not sure what was going on! Commented Jun 10, 2019 at 17:58

1 Answer


As you can see from the provided log output, Logstash shuts down immediately after the pipeline runs the SQL query, which explains why no data reaches Elasticsearch. As ibexit suggested, please enable debug-level logging (for example by starting Logstash with `--log.level=debug`) and post the output of the very same run.
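For completeness: the jdbc input runs its statement once and the pipeline then exits unless a `schedule` is configured. If the goal is a long-running pipeline that polls the table, something like the following keeps Logstash alive (the one-minute cron interval is an assumption, not from the original config):

```
input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/dashboarddb"
        jdbc_user => "user"
        jdbc_password => "password"
        jdbc_driver_library => "/home/ubuntu/mysql-connector-java-8.0.16/mysql-connector-java-8.0.16.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement => "SELECT * FROM scans;"
        # Without a schedule the statement runs once and Logstash shuts down;
        # this cron expression re-runs it every minute (interval is an assumption)
        schedule => "* * * * *"
    }
}
```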


1 Comment

Everything worked great after I rebooted my machine -- maybe things just needed to be started in sequence.
