
I'm trying to upload an Excel file to an Azure Storage blob in chunks, using stageBlock and commitBlockList from the BlockBlobClient class. The upload seems to succeed, but when I download and open the file, it appears to be broken.

I'm using React on the client and Node.js on the server. The code is below.

In the UI

const chunkSize = (1024 * 1024) * 25;  // 25 MB file chunk size

// slice the file and send the chunk to the API method

const fileReader = new FileReader();
const from = currentChunkIndexRef.current * chunkSize;
const to = from + chunkSize;
const blob = file.slice(from, to);

fileReader.onload = ((e: any) => uploadChunksToBlob(e, file, obj));
fileReader.readAsDataURL(blob);

// api method

const uploadChunksToBlob = async (event: any, file: File, obj: any) => {
  try {
    const totalChunks = Math.ceil(file.size / chunkSize);
    const uploadChunkURL = `/upload?currentChunk=${currentChunkIndexRef.current}&totalChunks=${totalChunks}&file=${file.name}&type=${file.type}`;
    console.log(event.target.result);
    const fileUpload = await fetch(uploadChunkURL, {
      method: "POST",
      headers: { "Content-Type": "application/octet-stream" },
      body: JSON.stringify(event.target.result),
    });
    const fileUploadJson = await fileUpload.json();
    const isLastChunk = (totalChunks - 1) === currentChunkIndexRef.current;
    if (!isLastChunk) {
      console.log({ Chunk: currentChunkIndexRef.current });
      currentChunkIndexRef.current = currentChunkIndexRef.current + 1;
      // eslint-disable-next-line @typescript-eslint/no-use-before-define
      uploadFileToAzureBlob(file, obj);
    } else {
      console.log("File Uploaded");
    }
  } catch (error) {
    console.log("uploadFileToAzureBlob Catch Error" + error);
  }
};

// In Node

const sharedKeyCredential = new StorageSharedKeyCredential(
  config.StorageAccountName,
  config.StorageAccountAccessKey
);
const pipeline = newPipeline(sharedKeyCredential);
const blobServiceClient = new BlobServiceClient(
  `https://${config.StorageAccountName}.blob.core.windows.net`,
  pipeline
);
const containerName = getContainerName(req.headers.key, req.headers.clientcode);
const identifier = uuid.v4();
const blobName = getBlobName(identifier, file);

const containerClient = blobServiceClient.getContainerClient(containerName);
const blockBlobClient = containerClient.getBlockBlobClient(blobName);


try {
  let bufferObj = Buffer.from(`${file}_${Number(currentChunk)}`, "utf8"); // Create buffer object, specifying utf8 as encoding
  let base64String = bufferObj.toString("base64"); // Encode the Buffer as a base64 string

  blockIds = [...blockIds, base64String];
  const bufferedData = Buffer.from(req.body);

  let resultOfUnitArray = new Uint8Array(bufferedData.length);
  for (let j = 0; j < bufferedData.length; j++) {
    resultOfUnitArray[j] = bufferedData.toString().charCodeAt(j);
  } // Converting string to bytes

  const stageBlockResponse = await blockBlobClient.stageBlock(base64String, resultOfUnitArray, resultOfUnitArray.length, {
    onProgress: (e) => {
      console.log("bytes sent: " + e.loadedBytes);
    }
  });

  if ((Number(totalChunks) - 1) === Number(currentChunk)) {
    const commitblockResponse = await blockBlobClient.commitBlockList(blockIds, { blobHTTPHeaders: req.headers });
    res.json({ uuid: identifier, message: 'File uploaded to Azure Blob storage.' });
  } else {
    res.json({ message: `Current Chunks ${currentChunk} is Successfully Uploaded` });
  }
} catch (err) {
  console.log({ err });
  res.json({ message: err.message });
}

I don't know what I'm doing wrong here.

Any help would be appreciated. Thank you.

  • You don’t need to go through FileReader. Just send the chunk blob directly, like fetch({ body: blob }). Currently you convert it to a dataURL, and that’s where things break. Commented Dec 6, 2022 at 12:20
  • @hackape thank you for the comment, but still same issue Commented Dec 6, 2022 at 12:46
  • Make sure you also change the node.js side accordingly. Now that the blob is sent directly from the client, const bufferedData = Buffer.from(req.body) IS THE BLOB, so you don't have to move bytes onto resultOfUnitArray. In other words, just use bufferedData in place of resultOfUnitArray. Commented Dec 6, 2022 at 19:48
  • And please update your question description if you encounter a further problem and provide useful info; "but still same issue" doesn't tell much. Commented Dec 6, 2022 at 19:51
  • @hackape thanks for the help, sending the actual blob is working now. Note: please post the answer so I can upvote it. But now I ran into a new issue, i.e. "the specified block list is invalid", and for this I will raise a new question Commented Dec 7, 2022 at 7:31

1 Answer


The problem is that you convert the chunk into a dataURL; that’s where things break.

It appears to me that you're under the wrong impression that you need to first encode the blob into a string in order to send it. You don't have to: the browser fetch API is capable of handling a raw binary payload.

So on the client (browser) side, you don’t need to go through FileReader. Just send the chunk blob directly.

const blob = file.slice(from, to);
// ...

fetch(uploadChunkURL, {
  method: "POST",
  headers: { "Content-Type": "application/octet-stream" },
  body: blob,
});

On the server (Node.js) side, you'll receive the blob in raw binary form, so you can simply forward it untouched to Azure Storage. There's no need to decode it from a string and move bytes onto resultOfUnitArray like you currently do.

// The block ID must be a base64-encoded string
const base64String = Buffer.from(`${file}_${Number(currentChunk)}`, "utf8").toString("base64");
// req.body already holds the raw chunk bytes, so forward it to stageBlock untouched
const bufferedData = Buffer.from(req.body);
const stageBlockResponse = await blockBlobClient.stageBlock(
  base64String,
  bufferedData,
  bufferedData.length
);
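
One thing to verify is that req.body actually arrives as a raw Buffer. If the Node side is an Express app (an assumption, the question doesn't say), the upload route needs a raw body parser registered for application/octet-stream, roughly like this; the /upload path is illustrative:

const express = require("express");
const app = express();

// Parse application/octet-stream request bodies into a raw Buffer so req.body
// holds the chunk bytes; raise the limit above the 25 MB client chunk size.
app.post(
  "/upload",
  express.raw({ type: "application/octet-stream", limit: "30mb" }),
  async (req, res) => {
    // req.body is a Buffer here and can be passed straight to blockBlobClient.stageBlock(...)
  }
);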