
I uploaded CSV data into Elasticsearch using the machine-learning approach described here.

This created an index and a pipeline with a csv processor. The import was successful.

What is the corresponding curl command to upload CSV data into Elasticsearch, assuming the index is called iislog and the pipeline iislog-pipeline?


2 Answers


The csv ingest processor will only work on a JSON document that contains a field with CSV data. You cannot throw raw CSV data at it using curl.

The CSV-to-JSON transformation happens in Kibana (when you drop the raw CSV file into the browser window); only then does Kibana send the JSON-wrapped CSV lines to Elasticsearch.

If your CSV looks like this:

column1,column2,column3
1,2,3
4,5,6
7,8,9

Kibana will transform each line into:

{"message": "1,2,3"}
{"message": "4,5,6"}
{"message": "7,8,9"}

And then Kibana will send each of those raw CSV/JSON documents to your iislog index through the iislog-pipeline ingest pipeline. The pipeline looks like this:

{
    "description" : "Ingest pipeline created by file structure finder",
    "processors" : [
      {
        "csv" : {
          "field" : "message",
          "target_fields" : [
            "column1",
            "column2",
            "column3"
          ],
          "ignore_missing" : false
        }
      },
      {
        "remove" : {
          "field" : "message"
        }
      }
    ]
}
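For reference, a pipeline like this can be created directly via the ingest API. This is a minimal sketch assuming an unsecured Elasticsearch on localhost:9200; adjust host, port, and credentials to your setup:

```shell
# Create (or overwrite) the iislog-pipeline via the _ingest API.
# Assumes a local, unsecured cluster on localhost:9200 (an assumption).
curl -H 'Content-Type: application/json' \
  -XPUT 'localhost:9200/_ingest/pipeline/iislog-pipeline' -d '
{
  "description": "Ingest pipeline created by file structure finder",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": ["column1", "column2", "column3"],
        "ignore_missing": false
      }
    },
    {
      "remove": {
        "field": "message"
      }
    }
  ]
}'
```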

In the end, the documents will look like this in your index:

{"column1": 1, "column2": 2, "column3": 3}
{"column1": 4, "column2": 5, "column3": 6}
{"column1": 7, "column2": 8, "column3": 9}

That's the way it works. So if you want to use curl, you need to do Kibana's pre-parsing job yourself and send the {"message": "..."} documents through the pipeline (note: with ignore_missing set to false, sending a document without a message field would make the csv processor fail):

curl -H 'Content-Type: application/json' -XPOST 'localhost:9200/iislog/_doc?pipeline=iislog-pipeline' -d '{"message": "1,2,3"}'
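To upload more than one row, the same wrapping can be scripted. A minimal sketch that builds a _bulk request body from a CSV file (the file names data.csv and bulk.ndjson are illustrative, and the simple printf does no JSON escaping, which is fine for plain numeric rows but not for values containing quotes or backslashes):

```shell
# Sample CSV matching the example above (file name is an assumption).
cat > data.csv <<'EOF'
column1,column2,column3
1,2,3
4,5,6
EOF

# Skip the header row and wrap each remaining line in a
# {"message": "..."} document, preceded by a bulk action line.
# Note: no JSON escaping is done here; rows must not contain " or \.
tail -n +2 data.csv | while IFS= read -r line; do
  printf '{"index":{}}\n{"message":"%s"}\n' "$line"
done > bulk.ndjson

cat bulk.ndjson

# The body could then be sent through the pipeline with something like:
# curl -H 'Content-Type: application/x-ndjson' \
#   -XPOST 'localhost:9200/iislog/_bulk?pipeline=iislog-pipeline' \
#   --data-binary @bulk.ndjson
```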



There is another approach to index CSV data into Elasticsearch using an ingest pipeline, described here: https://www.elastic.co/de/blog/indexing-csv-elasticsearch-ingest-node

In the end, it wraps each line in a JSON document and grok-parses it so that the CSV columns are mapped to specific document fields.
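A minimal sketch of such a grok-based pipeline for the three-column example above (the pattern and description are illustrative, not taken from the blog post):

```json
{
  "description": "Parse CSV rows with grok instead of the csv processor",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{DATA:column1},%{DATA:column2},%{GREEDYDATA:column3}"]
      }
    },
    {
      "remove": {
        "field": "message"
      }
    }
  ]
}
```

Unlike the csv processor, grok gives you full control over the pattern per column, at the cost of maintaining the regex yourself.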
