
I'm upgrading a backend system that uses require('./file.json') to read a 1 GB JSON file into an object, then passes that object to other parts of the system.

I'm aware of two ways to read a JSON file into an object:

const fs = require('fs');
const rawdata = fs.readFileSync('file.json'); // returns a Buffer
const data = JSON.parse(rawdata); // implicitly converts the Buffer to a string

and

const data = require('./file.json');

This works fine in older versions of Node (12) but not in newer versions (14 or 16).

So I need another way to load this 1 GB file.json into const data without hitting the ERR_STRING_TOO_LONG / "Cannot create a string longer than 0x1fffffe8 characters" error.

I've seen examples on Stack Overflow and elsewhere showing how to stream huge JSON files like this, breaking them into smaller objects and processing them individually. That's not what I'm looking for: I need everything in one data object, so the parts of the system that expect a single data object don't have to be refactored to handle a stream.

Note: the top-level value in the JSON file is not an array.

  • That works in older versions of Node, but not Node 16; I get an ERR_STRING_TOO_LONG error on huge JSON files. Commented Jul 27, 2021 at 12:36
  • 1
    All this info is already in the main post: it's a 1024 MB+ file, and require/JSON.parse doesn't work. Commented Jul 27, 2021 at 12:44
  • Node 12 was released in 2019 (long after that post), so it's not an issue there. I don't need to create a huge string, just a huge object (which works fine in V8). Commented Jul 27, 2021 at 12:56

2 Answers


Using big-json solves this problem.

npm install big-json

const fs = require('fs');
const json = require('big-json');

const readStream = fs.createReadStream('file.json');
const parseStream = json.createParseStream();

parseStream.on('data', function (pojo) {
    // 'data' fires once, with the fully reconstructed POJO
});

readStream.pipe(parseStream);
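If the rest of the system expects a plain const data rather than an event callback, the parse stream's single 'data' event can be wrapped in a promise. A minimal sketch: the helper below works with any object-mode stream that emits one object, and the commented-out usage assumes big-json is installed as shown above:

```javascript
// Resolve with the single object emitted by an object-mode stream.
// big-json's parse stream emits exactly one 'data' event carrying the
// reconstructed POJO, so awaiting that event yields the whole object.
function streamToObject(stream) {
    return new Promise((resolve, reject) => {
        stream.once('data', resolve);
        stream.once('error', reject);
    });
}

// Usage with big-json (assumed installed via `npm install big-json`):
//
//   const fs = require('fs');
//   const json = require('big-json');
//   const parseStream = json.createParseStream();
//   fs.createReadStream('file.json').pipe(parseStream);
//   const data = await streamToObject(parseStream);
```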



You need to stream it: process it in chunks instead of loading it all into memory at once.


const fs = require("fs");
const stream = fs.createReadStream("file.json");

stream.on("data", (data) => {
    // each chunk is a Buffer; this logs raw text, it does not parse JSON
    console.log(data.toString());
});

1 Comment

I run into the same ERR_STRING_TOO_LONG error.
