14

I am trying to create a function where I can pass a file path and read the file asynchronously. What I found out was that the library supports streams:

const fs = require('fs');
var parse = require('csv-parse');
var async = require('async');

readCSVData = async (filePath): Promise<any> => {
    let csvString = '';
    var parser = parse({delimiter: ','}, function (err, data) {
        async.eachSeries(data, function (line, callback) {
            csvString = csvString + line.join(',')+'\n';
            console.log(csvString) // I can see this value getting populated
        })
      });
      fs.createReadStream(filePath).pipe(parser);
}

I got this code from here, but I am new to Node.js, so I don't understand how to use await to get the data once all lines are parsed. I want to call it like this:

const csvData = await this.util.readCSVData(path)
  • Is there any promise-based CSV library that exists for older Node.js versions (i.e. Node.js v4.2.0)? Commented Jun 6, 2019 at 7:18

2 Answers

23

My best workaround for this task is:

const csv = require('csvtojson')
const csvFilePath = 'data.csv'
const array = await csv().fromFile(csvFilePath);

1 Comment

This answer helped to fix the issue I was facing.
14

This answer provides legacy code that uses the async library. Promise-based control flow doesn't need this library. Moreover, async.eachSeries serves no purpose inside the csv-parse callback: by the time that callback runs, data already contains all of the parsed rows.

If reading all the data into memory is not an issue, the CSV stream can be converted to a promise:

const fs = require('fs');
const getStream = require('get-stream');
const parse = require('csv-parse');

const readCSVData = async (filePath) => {
  const parseStream = parse({delimiter: ','});
  const data = await getStream.array(fs.createReadStream(filePath).pipe(parseStream));
  return data.map(line => line.join(',')).join('\n');
}

2 Comments

Why is this better than the other answer? Performance?
I can't say it's strictly better, but it processes data in chunks, which wastes less memory if a file is big. Also, data should preferably be processed in place instead of being accumulated into a single big array.
