
Suppose I have an array of some basic objects in JavaScript:

[
  {a: 'something', b: 'something else', c: 'other' },
  {a: 'something 2', b: 'something else 2', c: 'other 2' },
  // etc.
]

When I reach several hundred thousand of these objects, memory usage is already in the gigabytes under Node.js. How can I make this more efficient?

  • All of the keys are going to be the same for each object, but not necessarily populated in the same order. I suppose I could translate to an array of arrays, but there is overhead in converting back and forth (a rough sketch of what I mean follows this list).
  • I know the types of each value in the object ahead of time.
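
For illustration, the conversion I have in mind looks roughly like this (just a sketch; KEYS stands in for my fixed set of keys):

var KEYS = ['a', 'b', 'c'];

// Pack objects into positional rows so key names are not stored per object.
function pack(objects) {
  return objects.map(function (obj) {
    return KEYS.map(function (key) { return obj[key]; });
  });
}

// Rebuild a plain object from a row when one is actually needed.
function unpack(row) {
  var obj = {};
  KEYS.forEach(function (key, i) { obj[key] = row[i]; });
  return obj;
}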

Is there some sort of off-the-shelf in-memory table structure I can use?

I had considered using SQLite3 in-memory, but its non-atomic nature prevented me from using it well in my application. Perhaps there is some native JS alternative so that I could reuse it in browsers as well?

  • I think your problem might not be "how do I compress these in memory objects" but rather "how can I store these unused variables outside memory and fetch them when needed". I kinda doubt you need hundreds of thousands of these objects in memory at once. Or am I wrong? Commented Jul 9, 2015 at 22:11
  • how about {a: ['something', 'something 2'], b: ['something else', 'something else 2'], c: [...]}? That's quite minimal... 1 object, 3 properties, 3 arrays, all the values (can't use "less" than the values take to store) Commented Jul 9, 2015 at 22:11
  • @Jan There are definitely ways I can store these outside of the app, but that's what I'm trying to avoid. Think of an app that checks a few million URLs for availability. It's a relatively small amount of data with no need for long-term storage. Adding a dependency on a small database or similar feels over the top for this task. Commented Jul 9, 2015 at 22:14
  • SQLite is .. atomic (the A in ACID), what is that 'non-atomic' bit? Commented Jul 9, 2015 at 22:15
  • I don't think it really matters if you need it long term or it should be deleted when you're done with it, requiring a few gigabytes of free clientside memory for such a "small" task as you call it is not so nice towards the user, and makes your app fragile. Unless this is only run on a production machine with certain guaranteed RAM specs. Just limit the search/execution to one per user, delete the old data when they instigate a new "search" or when the user is logged out, then you can easily implement AJAX pagination of the data or somesuch. Commented Jul 9, 2015 at 22:19

2 Answers


"several hundred thousand" is a job for an intelligent paging algorithm server side and backed with a database.

You might load pages of objects from the server via an AJAX call when they are needed and drop them as soon as they are unused.
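
A rough sketch of the idea, assuming a hypothetical /objects endpoint that accepts page and pageSize query parameters (the endpoint is a placeholder for your own API):

var currentPage = null;

// Fetch one page of objects; the previous page is dropped, so only
// pageSize objects are held in memory at any one time.
function loadPage(page, pageSize) {
  return fetch('/objects?page=' + page + '&pageSize=' + pageSize)
    .then(function (res) { return res.json(); })
    .then(function (objects) {
      currentPage = objects;
      return currentPage;
    });
}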


2 Comments

A variant of that is what I was thinking too.
I'm not building a web application here. I was hoping to find a technique that applies to any JS application, but my immediate use case has nothing to do with web applications. Think of a cron job or similar task.

Use a flat array, indexing into it with a fixed stride of 3 items per object:

var data = ['something', 'something else', 'other', 'something 2', 'something else 2', 'other 2'];

// Each object occupies 3 consecutive slots, so its data starts at index * 3.
function getObjectAt(index) {
  var offset = index * 3;
  return { a: data[offset], b: data[offset + 1], c: data[offset + 2] };
}

getObjectAt(1); // { a: 'something 2', b: 'something else 2', c: 'other 2' }

The memory footprint should be smaller since the key names are no longer stored per object. Of course, there are also many off-the-shelf solutions out there, like Redis or HBase.
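
If you also need to update records, a matching setter keeps the flat array as the single source of truth (a sketch using the same 3-slot layout as above):

function setObjectAt(index, obj) {
  var offset = index * 3;
  data[offset] = obj.a;
  data[offset + 1] = obj.b;
  data[offset + 2] = obj.c;
}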

1 Comment

Not a bad idea. I think I will do this. It's very simple.
