
I have a large JavaScript object that I want to convert to JSON and write to a file. I thought I could do this using streams, like so:

  var fs = require('fs');
  var JSONStream = require('JSONStream');
  var st = JSONStream.stringifyObject()
             .pipe(fs.createWriteStream('./output_file.js'))

  st.write(large_object);

When I try this I get an error:

stream.js:94
  throw er; // Unhandled stream error in pipe.
        ^
TypeError: Invalid non-string/buffer chunk
    at validChunk (_stream_writable.js:153:14)
    at WriteStream.Writable.write (_stream_writable.js:182:12)

So apparently I can't just write an object to this stringifyObject. I'm not sure what the next step is. Do I need to convert the object to a buffer? Run the object through some conversion stream and pipe it to stringifyObject?

2 Answers


JSONStream doesn't work that way, and since your large object is already loaded into memory there is no point in streaming it anyway.

var fs = require('fs-extra');
var file = '/tmp/this/path/does/not/exist/file.txt';

fs.outputJson(file, {name: 'JP'}, function (err) {
  console.log(err); // => null
});

That will write the JSON.

If you want to use JSONStream you could do something like this:

var fs = require('fs');
var jsonStream = require('JSONStream');

var fl = fs.createWriteStream('dat.json');

var out = jsonStream.stringifyObject();
out.pipe(fl);

// stringifyObject() expects [key, value] pairs, not a whole object
var obj = { test: 10, ok: true };
for (var key in obj) out.write([key, obj[key]]);
out.end();

5 Comments

Your first suggestion leads to FATAL ERROR: JS Allocation failed - process out of memory
I just changed the second one to be exact code for your situation, unless large_object is an array. Try that.
I tried the second version; however, I have a large nested object as one of the obj[key] values, and that's what's throwing the memory allocation error. I'd need something similar that is recursive for child objects.
@kevzettler May I ask how did you solve it with recursion?
I've created a gist that streams the json with a TransformStream: gist.github.com/adrai/713b298fd83da0063910aa9f1674a5ed

The question is quite old but still relevant today. I faced the same issue and solved it using the JsonStreamStringify package.

const { JsonStreamStringify } = require("json-stream-stringify");

Now,

// pipe() already forwards every chunk to res, so there is no need
// for an extra res.write() in a 'data' handler (doing both would
// write everything twice).
const x = new JsonStreamStringify(cursor);
x.pipe(res);

Here you can read your file using fs and pass the result in; 'cursor' is whatever value you want to serialize.

In this way, you can stream your data in valid JSON format.

For Docs: https://www.npmjs.com/package/json-stream-stringify

Comments
