
I'm actually running a JS script (ES6 and Guzzle) in the browser (it has to run in a browser, no NodeJS). This script calls some XML files and stores the results for later usage (I output the converted results, then process them so they can be imported into a database).

So this script generates an array containing thousands and thousands of small JS objects (from the XML parsing).

As the script takes a really long time to run, I'm looping over my URL array (I have a list of all file URLs) and storing each query result in a classic JS variable, and also in localStorage after JSON-encoding. Because the whole array is JSON-encoded, the localStorage value is erased every time and a new, bigger string is saved under the same key.
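To illustrate, the loop is roughly something like this (a simplified sketch; extractData() and the "results" key are just placeholders for my real XML-to-object parsing):

  async function fetchAll(urls) {
    const results = [];                        // the "classic JS variable"

    for (const url of urls) {
      const response = await fetch(url);       // query one XML file
      const text = await response.text();
      const xml = new DOMParser().parseFromString(text, "text/xml");
      results.push(extractData(xml));          // extractData(): placeholder for my own parsing

      // JSON-encode the WHOLE array and overwrite the same key on every iteration
      localStorage.setItem("results", JSON.stringify(results));
    }
    return results;
  }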

My questions:

  • Is it better to use only a classic variable? Or only the local storage?
  • Is there any other way to store a large amount of data for a script? (temporary blob, text file, DOM append...)

From my tests, after 3-4k files have been queried and their results stored, the browser starts to slow down a lot and the number of HTTP requests per minute drops drastically.

Thanks!

Notes:

  • It has to run in a browser (I need some dynamic DOM data; it's an internal dashboard that displays stats, with user inputs for live settings).
  • It only needs to run on the latest Chrome or Firefox.
  • If you can use jQuery Ajax, why can't you post to an API and drop that data there? Commented Aug 22, 2018 at 16:16
  • Why not just store each object in its own entry? Commented Aug 22, 2018 at 16:24
  • Yes, I could use an API... I wanted to avoid having to develop one :) What do you mean by each object in its own entry, @Jonas? Commented Aug 22, 2018 at 16:31
  • "storing query result into a classic JS variable" - not sure what you mean. Can you show us the code? Commented Aug 22, 2018 at 19:09
  • Something simple like var MyVar = []; then MyVar.push(). That's what I'm calling a "simple" variable. Commented Aug 23, 2018 at 9:50

1 Answer

the localStorage value is erased every time and a new bigger string is saved for the same key.

This deserialize-append-reserialize cycle is what slows the page down. Instead, you could store each entry under its own key; that way, appending is much more performant:

  class PersistentArray {
    constructor(name) {
      this.name = name;
      // The key `name` holds the element count; element i lives under `name + i`.
      this.length = +localStorage.getItem(name) || 0;
    }

    push(value) {
      this.set(this.length, value);
    }

    set(index, value) {
      if (index >= this.length)
        localStorage.setItem(this.name, this.length = index + 1);
      localStorage.setItem(this.name + index, JSON.stringify(value));
    }

    get(index) {
      return JSON.parse(localStorage.getItem(this.name + index));
    }

    *[Symbol.iterator]() {
      for (let i = 0; i < this.length; i++)
        yield this.get(i);
    }
  }

That way you can easily push values as:

  const pages = new PersistentArray("pages");

  // ... sometime later
  pages.push({ value: "whatever" });

When all the data is there:

  // Turn it into a real in-memory array:
  const result = [...pages];

  // Or load entries lazily while iterating:
  for(const page of pages) 
    console.log(page);

2 Comments

Really nice, I'm gonna try it, see the results and get back to you.
Works like a charm, it's a lot faster, thanks. But in the end, using another PHP API is the better way, as there is too much data for localStorage. Still, your solution fixed the lagging issue when using localStorage.
