
I am having trouble allowing users to download a large report from the database.

Users can browse the report on the website; I have set up a limit and pagination for the pages, so no problem there.

But now I am adding the functionality to download the whole report at once as a CSV file. I get a memory error saying I have reached the maximum memory allowed, and I don't have permission on the server to increase it. Does anybody know how I can make this possible? So far, just running the query, which has 700k results, stops my script.

P.S. I have searched around Stack Overflow and so far can't find an answer. I am using PHP and MySQL to get the data.

Thanks in advance

4 Answers


Do not export the whole data set in a single request.

Apply a limit and an offset when exporting the data from the database.

Like this (sketched in code after the list):

  • on the first pass, export rows 0 to 9999, append them to the CSV, and redirect to the same URL with offset = 10000 and limit = 10000
  • on the second pass, export rows 10000 to 19999, append them to the CSV, and redirect to the same URL with offset = 20000 and limit = 10000
  • and so on
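
A minimal sketch of that idea, assuming mysqli; the report table name, the export.php filename, the file paths, and the credentials are all placeholders:

    <?php
    // export.php - hypothetical chunked export: each request writes one
    // chunk, then redirects back to itself with a larger offset.
    $limit  = 10000;
    $offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
    $result = $mysqli->query("SELECT * FROM report LIMIT $limit OFFSET $offset");

    // Append this chunk to a file on disk so it never accumulates in PHP memory.
    $fh = fopen(__DIR__ . '/exports/report.csv', $offset === 0 ? 'w' : 'a');
    while ($row = $result->fetch_assoc()) {
        fputcsv($fh, $row);
    }
    fclose($fh);

    if ($result->num_rows === $limit) {
        // More rows may remain: redirect to this script with the next offset.
        header('Location: export.php?offset=' . ($offset + $limit));
    } else {
        // All chunks written: redirect to the finished file.
        header('Location: exports/report.csv');
    }

Because every redirect starts a fresh request, each pass begins with a clean memory footprint; only one 10000-row chunk is ever held at a time.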

4 Comments

How can I redirect? If I have already sent the header to download, it needs to all be in one file.
Your script must work on a URL. I mean, if I click on some link, it should export the data: initially 10000 records, then set the header to the same URL as the previous link, but now with a different offset. Once all records have been exported, serve the finished file for download.
This might appear to solve the problem temporarily, but the approach does not change the size of the result set returned by MySQL, which is mapped into PHP's memory space when the query is executed. The functions/methods retrieving an individual row operate on that local copy; they do not go back to the DBMS.
I guess I misunderstood your notion: so I append all the data to an external file, then redirect to that file location. Thank you, I will try that.

As per my comment on Gaurav's answer, you can't fetch that much data from the database in a single go; it doesn't matter how clever you are in how you manipulate it later.

You must either

  • restrict the amount of data fetched from MySQL in a single operation, and potentially keep repeating this until you have the full data set. As Gaurav suggests, you'll need to keep the data outside of PHP's memory space in a file, then redirect to it. But given that the process is going to be slow already, this is not a scalable solution either.

How can I redirect? If I have already sent

This is trivial. If you're having problems understanding how to do this, then you're already massively out of your depth.

  • alternatively, you could generate the report outside of the webserver's PHP, but if you don't have permission to configure the webserver then you'll probably not be able to run programs there either.
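
On the memory-space point: by default, mysqli buffers the complete result set inside PHP when the query executes, which is exactly what exhausts the memory limit here. As a minimal sketch of an alternative, assuming mysqli and a placeholder report table, passing MYSQLI_USE_RESULT switches to an unbuffered cursor, so rows stream from MySQL one at a time and can be written straight to the response:

    <?php
    // Hypothetical streaming export: the result set is never buffered in
    // PHP, so memory use stays flat however many rows the query returns.
    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="report.csv"');

    $result = $mysqli->query('SELECT * FROM report', MYSQLI_USE_RESULT);
    $out = fopen('php://output', 'w'); // write rows straight to the client
    while ($row = $result->fetch_assoc()) {
        fputcsv($out, $row);
    }
    $result->close();
    fclose($out);

The trade-off of an unbuffered query is that the connection is tied up until the last row has been read, so nothing else can run on it in the meantime.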



Just like @symbcbean said, the process of creating a file and appending the data with each query offset is very slow. So I came up with an alternative that I thought I should share in case someone else is facing the same issue.

I have set up a cronjob that generates the file with all the necessary data in it at night, since the data only changes a few times a week. So now I've removed the overhead of generating a new CSV every time someone needs to access it; instead, I simply redirect to the pre-made file and the work is done!
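
For anyone wanting to replicate this, a hedged sketch of what such a nightly job might look like; the cron schedule, paths, credentials, and table name are all assumptions:

    <?php
    // generate_report.php - hypothetical CLI script run from cron, e.g.:
    //   0 3 * * * php /path/to/generate_report.php
    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

    // Build the file under a temporary name, then rename it into place, so a
    // visitor downloading during the rebuild never sees a half-written CSV.
    $final = '/var/www/html/exports/report.csv';
    $tmp   = $final . '.tmp';
    $fh    = fopen($tmp, 'w');

    // An unbuffered query keeps memory flat even for 700k rows.
    $result = $mysqli->query('SELECT * FROM report', MYSQLI_USE_RESULT);
    while ($row = $result->fetch_assoc()) {
        fputcsv($fh, $row);
    }
    $result->close();
    fclose($fh);
    rename($tmp, $final);

The web-facing page then just links (or redirects) to exports/report.csv as a static file, so user downloads never touch the database at all.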

Thank you @Gaurav & @symbcbean!



You can append data to a file in bunches of records (see the sketch below). Something like:

  • get the first 10000 records from MySQL
  • write the first 10000 records to the file
  • get the second 10000 records
  • write the second 10000 records to the file
  • then create a link to the file and send it to the user
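
A rough sketch of that loop, assuming mysqli; the report table and file path are placeholders:

    <?php
    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
    $fh     = fopen(__DIR__ . '/exports/report.csv', 'w');

    $limit  = 10000;
    $offset = 0;
    do {
        // Fetch one bunch of records at a time so only 10000 rows
        // are ever held in PHP memory at once.
        $result = $mysqli->query("SELECT * FROM report LIMIT $limit OFFSET $offset");
        while ($row = $result->fetch_assoc()) {
            fputcsv($fh, $row);
        }
        $count   = $result->num_rows;
        $result->free();
        $offset += $limit;
    } while ($count === $limit);

    fclose($fh);
    echo '<a href="exports/report.csv">Download the report</a>';

One caveat: MySQL still scans past all the skipped rows for a large OFFSET, so later chunks get slower; keyset pagination (WHERE id > the last id seen, ORDER BY id) is the usual fix.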

