
I'm trying to do some mapping/reducing on a JavaScript object and failing miserably.

The data as it comes from the backend comes in looking something like this:

[
  {table:"a1", data: {colA:1,colB:2,colC:3}},
  {table:"a2", data: {colA:2,colB:3,colC:4}},
  {table:"a3", data: {colA:3,colB:4,colC:5}}
]

Recharts needs the data in the following format (the keys of each source data object become the unique "name" values in the result):

[
  {name: 'colA', a1: 1, a2: 2, a3: 3},
  {name: 'colB', a1: 2, a2: 3, a3: 4},
  {name: 'colC', a1: 3, a2: 4, a3: 5}
]

My current solution is O(n^n) because I'm building a result object and looping over it every time. I'm using ECMA6/Babel as well as Lodash. Any guidance would be greatly appreciated! Thanks!

Edit: Here's my current solution

var dest = []
Lodash.forEach(source,(o) => {
  var table = o.table;
  Lodash.forEach(o.data, (p,q) => {
    // See if the element is in the array
    const index = Lodash.findIndex(dest,(a) => {return a.name === q});
    if ( index === -1) {
      var obj = {};
      obj[table] = Number(p);
      obj.name = q;
      dest.push(obj);
    } else {
      dest[index][table] = Number(p);
    }
  })
});
Comments:

  • What is the exact problem? If you need performance, use raw for. Commented May 10, 2017 at 19:34
  • Could you post your current solution? Commented May 10, 2017 at 19:35
  • Please show us your current code. Commented May 10, 2017 at 19:36
  • Your solution has complexity O(n²·m), not O(n^n). Commented May 10, 2017 at 22:16

2 Answers


To start, the complexity of your algorithm is roughly O(n·m²) for n tables and m columns, not O(n^n), because you have three levels of iteration: the loop over the tables, the loop over each data object, and the findIndex scan of the growing result array on every step. The way I would simplify it is to build an object first, and then create an array from the object. Like so:

function convert(original) {
    var tmp = {};
    for (var i = 0; i < original.length; i++) {
        var tableObj = original[i];
        for (var colKey in tableObj.data) {
            // Create the result row for this column on first sight
            if (tmp[colKey] === undefined) {
                tmp[colKey] = {name: colKey};
            }
            tmp[colKey][tableObj.table] = tableObj.data[colKey];
        }
    }

    var output = [];
    for (var colName in tmp) {
        output.push(tmp[colName]);
    }

    return output;
}

var original = [
  {table:"a1", data: {colA:1,colB:2,colC:3}},
  {table:"a2", data: {colA:2,colB:3,colC:4}},
  {table:"a3", data: {colA:3,colB:4,colC:5}}
]

console.log(convert(original));

This eliminates the need to loop over the result array to add to the objects. You still do O(n·m) work iterating over the input array and its data, plus O(m) to build the output array at the end, but you no longer rescan the result array on every single iteration.

This function can also handle instances where your data might be different per table. So you may have colA and colB but not colC in one of the entries, for example. Or you may have a colD on another entry.
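For instance, running convert over a hypothetical uneven input (no colC in one row, an extra colD on another) simply yields result rows with absent keys wherever a table had no value:

```javascript
// The convert function from the answer above (lightly condensed)
// so this snippet runs on its own.
function convert(original) {
    var tmp = {};
    for (var i = 0; i < original.length; i++) {
        var tableObj = original[i];
        for (var colKey in tableObj.data) {
            if (tmp[colKey] === undefined) {
                tmp[colKey] = {name: colKey};
            }
            tmp[colKey][tableObj.table] = tableObj.data[colKey];
        }
    }
    var output = [];
    for (var colName in tmp) {
        output.push(tmp[colName]);
    }
    return output;
}

// Hypothetical uneven input: a1 has no colC, a2 adds a colD
var uneven = [
  {table: "a1", data: {colA: 1, colB: 2}},
  {table: "a2", data: {colA: 2, colB: 3, colD: 9}}
];

console.log(convert(uneven));
// [ { name: 'colA', a1: 1, a2: 2 },
//   { name: 'colB', a1: 2, a2: 3 },
//   { name: 'colD', a2: 9 } ]
```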




You can massively simplify this if you use a map to track the eventual columns. By storing them in a map as you go, you gain the benefit of constant lookup time.

If we say that the number of tables is N and the number of columns is M, then you'll get O(N·M).

let input = [
  { table: "a1", data: { colA: 1, colB: 2, colC: 3 } },
  { table: "a2", data: { colA: 2, colB: 3, colC: 4 } },
  { table: "a3", data: { colA: 3, colB: 4, colC: 5 } }
];

let desiredOutput = [
  { name: 'colA', a1: 1, a2: 2, a3: 3 },
  { name: 'colB', a1: 2, a2: 3, a3: 4 },
  { name: 'colC', a1: 3, a2: 4, a3: 5 }
];

let keys = null;
let map = null;

input.forEach(row => {
  if (map === null) {
    // Cache the column names (assumes every row has the same columns)
    keys = Object.keys(row.data);
    
    // Generates objects such as `{ name: 'colA' }`
    // and stores them at a key of 'colA'
    map = keys
      .reduce((o, k) => (o[k] = {
        name: k
      }, o), {});
  }

  // For each column ('colA', 'colB', etc.)
  keys.forEach(key => {
    // Create a new property for the table name
    // ('a1', 'a2', etc.)
    // and copy the matching column value from the input
    map[key][row.table] = row.data[key];
  });
});

// Convert the map to an array of just the values
let output = Object.values(map);
console.log(JSON.stringify(output) === JSON.stringify(desiredOutput));
console.log(output);

Comments:

Nice use of some helper methods such as reduce and forEach. Just be aware that they do produce their own overhead as well. I speed tested both of our solutions, and yours clocks in at ~1500ms for 1,000,000 iterations where mine clocks in at ~900ms for the same number of iterations. The only difference between ours is that you use the helper methods while I use more vanilla javascript. Just be careful of convenience.
@PeterLaBanca Quite true. If the dataset is significantly large and speed is absolutely required then helper methods should be avoided.
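For readers who want to reproduce a comparison like the one in the comments rather than take the numbers on faith, a minimal timing harness (illustrative, not from either answer) looks like this:

```javascript
// Illustrative timing harness: run a converter function over the
// same input `iterations` times and return the elapsed milliseconds.
function benchmark(fn, input, iterations) {
  var start = Date.now();
  for (var i = 0; i < iterations; i++) {
    fn(input);
  }
  return Date.now() - start;
}

var input = [
  {table: "a1", data: {colA: 1, colB: 2, colC: 3}},
  {table: "a2", data: {colA: 2, colB: 3, colC: 4}},
  {table: "a3", data: {colA: 3, colB: 4, colC: 5}}
];

// Usage sketch, assuming each answer's code is wrapped in a function
// (convertVanilla and convertHelpers are placeholder names):
// console.log(benchmark(convertVanilla, input, 1000000) + 'ms');
// console.log(benchmark(convertHelpers, input, 1000000) + 'ms');
```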
