
I have a million rows in a table that I'd like to import into Elasticsearch. How do I do that?


2 Answers

  1. Export the data to JSON: https://hashrocket.com/blog/posts/create-quick-json-data-dumps-from-postgresql
  2. Import the JSON file with the Bulk API (note the required file format, described below):

     curl -XPUT localhost:9200/_bulk -H 'Content-Type: application/x-ndjson' --data-binary @shakespeare.json
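The Bulk API does not accept a plain JSON array: it expects newline-delimited JSON in which each document line is preceded by an action line. A minimal sketch of what the file contents need to look like (the index name my_index and the field names are made-up placeholders; older Elasticsearch versions also require a _type in the action line):

{ "index" : { "_index" : "my_index", "_id" : "1" } }
{ "name" : "Alice", "city" : "Berlin" }
{ "index" : { "_index" : "my_index", "_id" : "2" } }
{ "name" : "Bob", "city" : "Paris" }

So after exporting from the database, you may need a small transformation step to interleave the action lines before feeding the file to _bulk.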

You can use Logstash for this. A sample config file to ship data from PostgreSQL to Elasticsearch would look like the one below. You can follow this link for a detailed tutorial.

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://<my-host>:5432/<my-database>"
    jdbc_user => "<my-username>"
    jdbc_password => "<my-password>"
    # Path to the PostgreSQL JDBC driver jar
    jdbc_driver_library => "<path-to>/postgresql-42.1.4.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from contact"
  }
}
output {
  elasticsearch {
    index => "contact"
    hosts => ["localhost:9200"]
  }
}
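
For a table with a million rows, it is also worth enabling the JDBC input's paging so Logstash fetches the data in batches instead of one giant result set. A sketch of the relevant extra settings (the page size of 50000 is an arbitrary value to tune for your setup):

input {
  jdbc {
    # ... connection settings as above ...
    statement => "SELECT * from contact"
    jdbc_paging_enabled => true  # wrap the statement in paged queries
    jdbc_page_size => 50000      # rows fetched per page
  }
}

Then run it with bin/logstash -f postgres-to-es.conf, where postgres-to-es.conf is whatever name you saved the config under.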
