I have a million rows in a table that I'd like to import into Elasticsearch. How do I do that?
2 Answers
- Export the data to JSON: https://hashrocket.com/blog/posts/create-quick-json-data-dumps-from-postgresql
- Import the JSON file with the bulk API: curl -XPUT localhost:9200/_bulk --data-binary @shakespeare.json (the file has to be in the newline-delimited bulk format; see the sketch below)
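Note that the _bulk endpoint does not accept a plain JSON array: each document must be preceded by an action line and the file must end with a newline, and on Elasticsearch 6+ the request also needs a Content-Type header. A minimal sketch of what the file and the import command could look like (the index name contacts and the file name contacts.json are just placeholders; depending on your Elasticsearch version the action line may also need a _type):

{"index":{"_index":"contacts"}}
{"id":1,"name":"Alice","email":"alice@example.com"}
{"index":{"_index":"contacts"}}
{"id":2,"name":"Bob","email":"bob@example.com"}

curl -XPOST 'localhost:9200/_bulk' -H 'Content-Type: application/x-ndjson' --data-binary @contacts.json

For a million rows it is usually better to split the export into several smaller bulk files rather than sending one huge request.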
You can use Logstash for this; a sample config file that ships data from PostgreSQL to Elasticsearch would look like the one below. You can follow this link for a detailed tutorial.
input {
  # Pull rows out of PostgreSQL with the JDBC input plugin
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://<host>:5432/<database>"
    jdbc_user => "<my-username>"
    jdbc_password => "<my-password>"
    jdbc_driver_library => "<path-to>/postgresql-42.1.4.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from contact"
  }
}
output {
  # Write each row as a document into the "contact" index
  elasticsearch {
    index => "contact"
    hosts => ["localhost:9200"]
  }
}
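To run the import, point Logstash at the config file. For a table with a million rows it usually helps to let the JDBC input page through the results instead of pulling everything in one query; jdbc_paging_enabled and jdbc_page_size are options of the JDBC input plugin, and the file name postgres-to-es.conf is just an assumed name for the config above:

# inside the jdbc { ... } block above
jdbc_paging_enabled => true
jdbc_page_size => 50000

# then run Logstash with the config
bin/logstash -f postgres-to-es.conf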