
The function below was previously triggered by onCreate in Firestore, but I would prefer it as a (new) onCall. I have tested variations of it and it has worked about 20% of the time. From what I've read online, this means I'm not handling my returns and promises correctly. If anyone can tell me what I'm doing wrong or give me tips, it would be greatly appreciated.

export const DataStoreCall = functions.https.onCall((datas) => {
    const filePath = 'goodfile';
    const tempLocalFile = path.join(os.tmpdir(), filePath);
    const file = fs.createWriteStream(tempLocalFile);
    const cors = require('cors')({ origin: true });

    const newV = datas;
    const db = admin.firestore();

    const Id = datas.id;
    const UpstRef = db.collection('Review').doc(Id);

    https.get(newV.url, function (response) {

        return async function () {
            try {
                response.pipe(file);
                const colPath = 'dataStuff';

                const a = file.path;
                const b = newV.name;

                const colRef = db.collection(colPath);
                const batch = db.batch();

                let data;
                if (b.endsWith(".json")) {
                    data = await fs.readJSON(a);
                } else if (b.endsWith(".csv")) {
                    data = await parseCSV(a);
                } else if (b.endsWith(".xlsx")) {
                    data = await parseXLSX(a);
                } else if (b.endsWith(".xlx")) {
                    data = await parseXLSX(a);
                } else {
                    throw "Unknown file extension.";
                }

                for (const item of data) {
                    const docRef = colRef.doc();
                    batch.set(docRef, item);
                }
                // Commit the batch
                await batch.commit().then(() => {
                    //FIRESTORE Update
                });

                console.log("completed!");
            } catch (error) {
                console.log("failed!", error);
            }
        };

        function parseCSV(path): Promise<any> {
            return new Promise((resolve, reject) => {
                let lineCount = 0;

                ///CSV read/parse code
                //  ...resolve(data) on success,
                //  ...on("error", err => reject(err)) on failure
            });
        }

        function parseXLSX(path): Promise<any> {
            return new Promise((resolve, reject) => {
                //XLSX read/parse code
                resolve(data);
            });
        }
    });
});

I am on the Blaze plan and have enabled CORS/HTTP requests on the storage bucket.

  • When you say it works 20% of the time, what happens when it is not working? What is the expected behavior, and what are you seeing instead? Also, with HTTP functions, you need to send a response, as shown in the guide. Perhaps that is part of the problem. Commented Jun 22, 2018 at 21:48
  • Thanks for the response. I had a variation where I was sending a response from the HTTP function; when it does not work, it is usually stuck on the parseXLSX and parseCSV functions, based on the logs. I am not getting any errors in the logs. The function is just not completing. Aside from the HTTP return, is there anything else that needs to be changed? I've been working on it for 2 days now. Commented Jun 23, 2018 at 1:05
  • Your mistake might be in the code you've left out with the comment 'FIRESTORE Update'; there you should make sure you have a return statement returning a promise, for example `return db.collection("books").doc().set({foo: "bar"})`. Commented Jul 30, 2019 at 6:33
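The point made in the last two comments — that the handler must return its promise chain so the Cloud Functions runtime keeps the instance alive until the async work finishes — can be sketched without the Firebase SDK at all. `fakeCommit` below is a hypothetical stand-in for `batch.commit()`:

```javascript
// Stand-in for an async Firestore write such as batch.commit().
function fakeCommit() {
  return new Promise(resolve => setTimeout(() => resolve("committed"), 10));
}

// Wrong: the write is started but nothing is returned, so the platform
// considers the function finished and may tear it down mid-write.
function handlerWrong(data) {
  fakeCommit();
}

// Right: the promise chain is returned, so the platform waits for it
// and the resolved value becomes the callable's response payload.
function handlerRight(data) {
  return fakeCommit().then(result => ({ status: result }));
}

handlerRight({}).then(r => console.log(r.status)); // "committed"
```

This intermittent "works 20% of the time" behavior is typical of unreturned promises: whether the write finishes depends on how quickly the instance is reclaimed.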

1 Answer


I found a different and better solution for the function. I realized that Google Cloud Storage methods are all applicable to Firebase Storage. I used createReadStream to access the file data and send it to Firestore.

Below is the code. I've called it about 20 times and it has yet to fail. (Error handling is minimal; please see the docs for that.)

exports.getXl = functions.https.onCall((data) => {

  const db = admin.firestore();
  const filename = data.name;
  const storage = require('@google-cloud/storage')();
  const bucket = storage.bucket('bucketName');
  const XLSX = require('xlsx');
  const remoteFile = bucket.file(`foldername/${filename}`);
  const storename = data.storename;
  const colRef = db.collection(storename);

  const gcsStream = remoteFile.createReadStream();

  const allBuffer = new Promise((resolve, reject) => {
    const buffers = [];
    gcsStream.on('data', function (chunk) {
      buffers.push(chunk);
    });

    gcsStream.on('error', reject);

    gcsStream.on('end', function () {
      const buffer = Buffer.concat(buffers);
      const workbook = XLSX.read(buffer, {
        type: "buffer"
      });

      const sheetName = workbook.SheetNames[0];

      //CONVERTS STREAM TO JSON
      const abe = XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]);
      resolve(abe);
    });
  });

  //ALL ONCALL FUNCTIONS HAVE TO RETURN SOMETHING.
  return allBuffer.then(function (result) {
    const batch = db.batch();

    //USING BATCH BECAUSE CLOUD FUNCTIONS WILL CUT OFF THE PROCESS IF IT IS WRITING ONE DOC AT A TIME
    //BATCH HAS A LIMIT OF 500 DOCUMENT WRITES AT A TIME, SO LOOK UP HOW TO MANAGE CHUNKS IF NECESSARY
    for (const item of result) {
      const docRef = colRef.doc();
      batch.set(docRef, item);
    }
    console.log(result);
    return batch.commit();
  });
});
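The 500-write batch limit mentioned in the comments above can be handled by splitting the rows into chunks and committing one batch per chunk. A minimal sketch: the `chunk` helper is plain JavaScript, and the commented commit loop assumes the `db`, `colRef`, and `result` objects from the answer's code:

```javascript
// Split an array into consecutive chunks of at most `size` items.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Hypothetical usage with the answer's db/colRef (not runnable standalone):
// const commits = chunk(result, 500).map(rows => {
//   const batch = db.batch();
//   rows.forEach(item => batch.set(colRef.doc(), item));
//   return batch.commit();
// });
// return Promise.all(commits);

console.log(chunk([1, 2, 3, 4, 5], 2).length); // 3
```

Returning `Promise.all(commits)` keeps the "always return the promise chain" rule intact while staying under the per-batch write limit.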

