I have three tables in MySQL, and I indexed their data into Elasticsearch. It worked with the Logstash config below:
input {
  jdbc {
    jdbc_driver_library => "lib/mysql-connector-java-5.1.33.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
    jdbc_user => "test"
    jdbc_password => "test"
    statement => "SELECT * FROM TABLE1"
    schedule => "* * * * *"
    type => "table1"
  }
  #
  # two more inputs
  #
}
output {
  stdout { codec => rubydebug }
  if [type] == "table1" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "test"
      document_type => "%{type}"
      document_id => "%{id}"
      template => "./template"
      template_overwrite => true
    }
  }
  #
  # two more outputs
  #
}
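As I understand it, with `schedule => "* * * * *"` the jdbc input re-runs `SELECT * FROM TABLE1` every minute, and because of `document_id => "%{id}"` each row overwrites the document with the same id. So inserts and updates get re-indexed, but rows deleted in MySQL are never removed from Elasticsearch. For the update side, a sketch of an incremental variant using the plugin's `sql_last_value` (assuming TABLE1 has an `updated_at` timestamp column, which my real table may not):

```
jdbc {
  # ... same driver/connection settings as above ...
  # :sql_last_value holds the tracking column's value from the previous run
  statement => "SELECT * FROM TABLE1 WHERE updated_at > :sql_last_value"
  use_column_value => true
  tracking_column => "updated_at"
  tracking_column_type => "timestamp"
  schedule => "* * * * *"
  type => "table1"
}
```

This still does not propagate deletions, which is part of my question.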
Then I deleted the other two inputs and outputs from the config, and deleted some rows from TABLE1 in MySQL. I started Logstash again, but the index was not updated. So I ran:
curl -XDELETE 'http://localhost:9200/*'
and restarted Logstash. But all three tables were indexed again in Elasticsearch, with the same rows as before.

How can I make the Elasticsearch data update automatically when the MySQL data changes?