
I'm trying to use a JSON file as a kind of database for my application. So the user adds an input and it's written to the JSON file; then, when I need that information, I can loop through the JSON. I'm using 'fs' to write the JSON representation of my object to the file:

var fs = require('fs');

fs.appendFile('log.json', JSON.stringify(customer) + '\n', function (err) {
    if (err) {
        return console.log(err);
    }
    console.log('saved in log.json');
});

So my log.json looks like this:

{"name":"John","email":"[email protected]"}
{"name":"Peter","email":"[email protected]"}

This can still be used if I read each line and convert it to an object, but obviously it's not a valid JSON file, and it would be better if I could have this as output:

{
{"name":"John","email":"[email protected]"},
{"name":"Peter","email":"[email protected]"}
}

or even better:

[
{"name":"John","email":"[email protected]"},
{"name":"Peter","email":"[email protected]"}
]

So technically all I want to do is keep the brackets and append my text inside them. Please note that I can't store all of the inputs in an array/object and then write that to my log file. I want a method, free of external modules, to update my log.json file every time the user enters new information.

  • Maybe you would want to look at mongodb and how node.js does it? Commented May 6, 2015 at 1:51
  • Have you considered SQLite? It works with the same premise of keeping a simple text file as a database, and will be much faster when running as well as faster to implement. Commented May 6, 2015 at 1:51
  • Can you explain that last paragraph a bit more? The approach I would take is to read in the input file each time, parse it as JSON, push an entry to the returned array and then write the entire file back as JSON. Obviously, if the file will get large, this won't hold up as a good solution. Commented May 6, 2015 at 1:52
  • well using a db is not an option here. I'm trying to write an app to use json as a db for Azure, since apparently Microsoft doesn't want to add one for student accounts. Commented May 6, 2015 at 1:53
  • @GregL Yes that would be an option if the server runs constantly. All of the data would be gone if it restarts for any reason. Commented May 6, 2015 at 1:55

2 Answers


I came up with this solution, using a regex to replace the closing bracket for each new entry:

// 'data' holds the current contents of log.json, which starts out as "[]"
var new_json, result;
if (data.length < 5) {
    // file is still essentially empty: insert the first entry before the closing bracket
    new_json = JSON.stringify(customer) + '\n' + ']';
    result = data.replace(/\]$/, new_json);
} else {
    // file already has entries: separate the new entry from the previous one with a comma
    new_json = ',' + JSON.stringify(customer) + '\n' + ']';
    result = data.replace(/\]$/, new_json);
}

Then write the result back; when you read the file later, you can parse it with JSON.parse:

fs.writeFile('log.json', result, 'utf8', function (err) {
    if (err) throw err;
    fs.readFile('log.json', 'utf8', function (err, data) {
        if (err) throw err;
        console.log(JSON.parse(data));
    });
});
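
For reference, here is one way the two snippets above could be tied together. The appendCustomer wrapper name and the assumption that log.json has already been initialized with [] are mine, not part of the answer:

var fs = require('fs');

// Hypothetical wrapper around the approach above.
// Assumes log.json already exists and contains at least "[]".
function appendCustomer(customer, callback) {
    fs.readFile('log.json', 'utf8', function (err, data) {
        if (err) return callback(err);

        var entry;
        if (data.length < 5) {
            // first entry: no leading comma
            entry = JSON.stringify(customer) + '\n]';
        } else {
            // later entries: separate from the previous one with a comma
            entry = ',' + JSON.stringify(customer) + '\n]';
        }

        // swap the closing bracket for "new entry + closing bracket"
        var result = data.replace(/\]$/, entry);
        fs.writeFile('log.json', result, 'utf8', callback);
    });
}

appendCustomer({ name: 'John', email: 'john@example.com' }, function (err) {
    if (err) return console.log(err);
    console.log('saved in log.json');
});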



I would definitely recommend a database. If you really want to do it this way though, you could open the file, seek backwards from the end until you find ], write your new data, and write ] again.
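
A rough Node.js sketch of that idea (the appendEntry name and the assumption that log.json already exists with at least [] in it are mine); it also inserts the separating comma that, as the comment below points out, the description above leaves out:

var fs = require('fs');

// Sketch: find the trailing "]" and overwrite it with ",<entry>\n]".
function appendEntry(file, obj) {
    var fd = fs.openSync(file, 'r+');
    var size = fs.fstatSync(fd).size;
    var buf = Buffer.alloc(1);
    var pos = size - 1;

    // seek backwards from the end until we hit the closing bracket
    while (pos >= 0) {
        fs.readSync(fd, buf, 0, 1, pos);
        if (buf.toString() === ']') break;
        pos--;
    }
    if (pos < 0) throw new Error('no closing bracket found in ' + file);

    // no comma before the very first entry (the bracket of an empty "[]" sits at index 1)
    var isEmpty = pos <= 1;
    var entry = (isEmpty ? '' : ',') + JSON.stringify(obj) + '\n]';

    // overwrite the old "]" (and anything after it) with the new entry
    fs.writeSync(fd, entry, pos);
    fs.ftruncateSync(fd, pos + Buffer.byteLength(entry));
    fs.closeSync(fd);
}

appendEntry('log.json', { name: 'Peter', email: 'peter@example.com' });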

1 Comment

You haven't taken into account the commas.
