I am looking to upload a very large zip file (several hundred GB) from my remote server to my Google Drive using the Drive API v3. I tried following a tutorial at Towards Data Science, but it defers the use of resumable uploads to the Drive API documentation, which isn't very beginner-friendly. Other questions on this matter don't handle the file sizes I am dealing with, and they don't mention the issue of keeping the access token valid for the whole time the file is being uploaded. I also found another SO answer during my search, but it again uses a "multi-part" upload method.

Any help would be appreciated. I am looking to automate this with a Python script.

Thanks in advance!

  • First, I apologize that my answer was not useful for your situation. About your question, are these threads useful? stackoverflow.com/q/61759449/7108653 and stackoverflow.com/q/64587769/7108653 and stackoverflow.com/q/60528771 Commented Feb 16, 2023 at 0:37
  • @Tanaike stackoverflow.com/a/60536138/16476327 is quite relevant. However, I am unable to piece together the chunking that I'll need to get from the test file to the large file. Where do I add the chunk_size parameter in this answer? Commented Feb 16, 2023 at 17:13
  • Your file size is too big for a one-shot upload. Uploading files always involves a request timeout, a max request size, etc. That's why you end up at documentation talking about resumable uploads. BTW, have a look at the GDrive API, its intended use and service limits. Commented Feb 16, 2023 at 22:21
  • Thank you for replying. The sample script of stackoverflow.com/a/60536138 uses a single chunk. If you want to use multiple chunks, you can see "HTTP - multiple requests" in the official documentation (a minimal Python sketch follows this comment thread). Commented Feb 17, 2023 at 1:58
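
To tie the comment thread together, here is a minimal Python sketch of the chunked, resumable upload being discussed, using google-api-python-client. It assumes OAuth credentials have already been saved to a token.json file; the file names, scope, and the 100 MB chunk size are illustrative placeholders, not values taken from the linked answers. The chunk_size asked about above corresponds to the chunksize argument of MediaFileUpload.

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

# Previously authorized user credentials (token.json is a placeholder path).
creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/drive.file"])
service = build("drive", "v3", credentials=creds)

# resumable=True switches to the resumable upload protocol; chunksize should be
# a multiple of 256 KB (here 100 MB per request).
media = MediaFileUpload(
    "backup.zip",
    mimetype="application/zip",
    resumable=True,
    chunksize=100 * 1024 * 1024,
)
request = service.files().create(
    body={"name": "backup.zip"}, media_body=media, fields="id")

# next_chunk() uploads one chunk per call and returns the file metadata once
# the final chunk has been accepted; num_retries retries transient failures.
response = None
while response is None:
    status, response = request.next_chunk(num_retries=5)
    if status:
        print(f"Uploaded {int(status.progress() * 100)}%")
print("File ID:", response.get("id"))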

1 Answer


This is a PHP example, but it may help you. Upload like this:

public function uploadFileChunk($uploadFile)
{
    try {
        if ($uploadFile) {
            $client = $this->getClient();
            $service = new Google_Service_Drive($client);

            // Use only the file name (not the full path) as the Drive file name.
            $uploadFileName = basename($uploadFile);
            $file = new Google_Service_Drive_DriveFile();
            $file->setName($uploadFileName);

            // Upload in 100 MB chunks.
            $chunkSizeBytes = 100 * 1024 * 1024;

            // Defer the request so it can be wrapped in a resumable media upload.
            $client->setDefer(true);
            $request = $service->files->create($file);
            $media = new Google_Http_MediaFileUpload(
                $client,
                $request,
                'application/octet-stream',
                null,
                true,              // resumable upload
                $chunkSizeBytes
            );

            $fileSize = filesize($uploadFile);
            $media->setFileSize($fileSize);

            // Read the file chunk by chunk; nextChunk() returns the created
            // file once the last chunk has been uploaded, false otherwise.
            $status = false;
            $handle = fopen($uploadFile, "rb");
            while (!$status && !feof($handle)) {
                $chunk = fread($handle, $chunkSizeBytes);
                $status = $media->nextChunk($chunk);
            }

            fclose($handle);
            $client->setDefer(false);

            return $status ? $status->id : null;
        }
    } catch (Exception $e) {
        throw $e;
    }
}

You also need to refresh your access token when uploading large files (50 GB+), because the upload can run longer than the token's lifetime:

if ($client->isAccessTokenExpired()) {
    // Exchange the stored refresh token for a new access token and persist it.
    $client->fetchAccessTokenWithRefreshToken($refreshToken['refresh_token']);
    file_put_contents($credentialsPath, json_encode($client->getAccessToken()));
}
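
For the Python equivalent, the google-auth credentials object carries the refresh token and can be refreshed explicitly. Below is a rough sketch, assuming the same token.json placeholder as above; note that googleapiclient generally refreshes expired credentials automatically before each request, so this is mostly a belt-and-braces check for very long transfers.

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials

# Previously authorized user credentials (token.json is a placeholder path).
creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/drive.file"])

# Call this periodically (e.g. before each next_chunk() call) so the access
# token never expires in the middle of a multi-hour upload; refresh() trades
# the stored refresh token for a fresh access token.
if not creds.valid or creds.expired:
    creds.refresh(Request())

# Persist the refreshed token for later runs, mirroring the PHP snippet above.
with open("token.json", "w") as token_file:
    token_file.write(creds.to_json())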