
I am using SQS to upload my videos to an S3 bucket in the background. The queue works perfectly fine for small videos (~40 MB), but when I try to upload bigger videos (say 70 MB and more), the queue operation gets killed. Here's my queue operation's output:

vagrant@homestead:~/Laravel/video (master)*$ php artisan queue:work --tries=3
[2017-08-25 17:48:42] Processing: Laravel\Scout\Jobs\MakeSearchable
[2017-08-25 17:48:45] Processed:  Laravel\Scout\Jobs\MakeSearchable
[2017-08-25 17:48:51] Processing: App\Jobs\VideoUploadJob
Killed
vagrant@homestead:~/Laravel/youtube (master)*$ php artisan queue:work --tries=3
[2017-08-25 17:50:33] Processing: App\Jobs\VideoUploadJob
Killed
vagrant@homestead:~/Laravel/video (master)*$ 

Where do I need to change the setting? Is this something on the Laravel side or on SQS? Can anyone help me?

1 Answer

There are two likely causes: the worker process is either running out of memory or exceeding the maximum execution time.

Try $ dmesg | grep php — this will show you more details, e.g. whether the kernel's OOM killer terminated the PHP process.

Increase max_execution_time and/or memory_limit in your php.ini file.
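If memory is the problem, one option is to raise the limits in php.ini. The values below are illustrative, not recommendations — size them to your largest expected upload:

```ini
; php.ini — illustrative values, tune for your workload
memory_limit = 512M
max_execution_time = 300
```

Depending on your Laravel version, the queue worker also accepts related flags, e.g. `php artisan queue:work --memory=512 --timeout=300` (note that `--memory` tells the worker to exit and restart once it exceeds that usage; it does not raise the PHP limit itself). For large files, also consider streaming the upload to S3 in chunks rather than loading the whole video into memory.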
