
My server can be PHP or Node.js. I'd prefer to stay in PHP, but if it can't be done there, or is done better in Node, I'll appreciate that answer too.

I want to generate and export a very big CSV file. The problem is that I can't load all the data from MySQL at once, so currently I limit the query to an amount that doesn't crash the app, but that is slow and most of the data never gets exported.

I thought of these solutions to generate and export the CSV file:

  1. Send 1..N calls to my server; each call generates Part1...PartN of the CSV and the browser appends them (won't the browser crash?).
  2. In one request, stream the data to the browser, but then how does the browser start downloading the file?

I don't mind if the client has to wait a long time; I just want a good way to export a ~200 MB CSV from the data I have in MySQL.

Thanks.

1 Answer


This is a nice solution using MySQL's own syntax:

SELECT *
INTO OUTFILE '/tmp/dbexport.csv'
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
FROM tables;
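
If you want to trigger this from PHP and then push the result to the browser, a minimal sketch could look like the following. It is not a definitive implementation: the credentials and table name are placeholders, the MySQL account needs the FILE privilege, the output path must be allowed by secure_file_priv, the file must not already exist, and MySQL and the web server must share a filesystem, since INTO OUTFILE writes on the database host.

<?php
// Sketch: trigger the server-side export from PHP, then hand the resulting
// file to the browser. Assumes placeholder credentials, the FILE privilege,
// a path permitted by secure_file_priv, a file that does not already exist,
// and that MySQL and the web server see the same /tmp directory.
$pdo = new PDO('mysql:host=localhost;dbname=your_database;charset=utf8mb4', 'user', 'password');

$sql = <<<'SQL'
SELECT *
INTO OUTFILE '/tmp/dbexport.csv'
FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
FROM tables
SQL;
$pdo->exec($sql);

// Send the generated file to the browser as a download, without loading it
// into PHP memory.
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="dbexport.csv"');
header('Content-Length: ' . filesize('/tmp/dbexport.csv'));
readfile('/tmp/dbexport.csv');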

Have a look at this answer too.

You could also consider a .tsv (tab-separated file); it makes no real difference for your use case:

mysql your_database -e "select * from tables" -B > export.tsv 

4 Comments

It's a great answer, but I need to process the data on the server before sending it back to the browser. Any other idea?
I don't get what you mean; this way everything happens on the server. The results are on your server, in the /tmp/ folder, in the dbexport.csv file, so you can parse it on the server.
Parsing such a large file causes memory exhaustion, because the array that holds it gets too large for the app. To solve this I can read only part of the file at a time, but then I don't need the file at all: I can run partial queries, process them, save to multiple files / append to a single file, and then somehow send it back to the client browser. If you know how to send a ~200 MB file to the browser I'd be grateful.
To parse the data, small files will help with memory. Then, to flush 200 MB of data to the browser, I think pagination is best for you: sort your data somehow so it can be sent in slices (20 to 100 elements at a time), organize requests like /mydbcontent/?ini=0&size=100, and flush them as arrays of JSON data.
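
The last two comments come down to two options: paginate JSON slices to the browser, or stream the whole CSV in one request (option 2 from the question). For the streaming route, here is a minimal PHP sketch, not a definitive implementation: the connection details and the table name ("tables") are placeholders. An unbuffered query keeps memory usage flat while per-row processing still happens in PHP, and the Content-Disposition header is what makes the browser start a file download.

<?php
// Sketch of the single-request streaming approach. Placeholder credentials
// and table name; adapt the query and per-row processing to your data.
set_time_limit(0); // the export may run for a while

$pdo = new PDO(
    'mysql:host=localhost;dbname=your_database;charset=utf8mb4',
    'user',
    'password',
    [PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false] // fetch rows one at a time instead of loading the whole result set
);

header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="export.csv"');

$out  = fopen('php://output', 'w');
$stmt = $pdo->query('SELECT * FROM tables');

$first = true;
$count = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    if ($first) {
        fputcsv($out, array_keys($row)); // column names as the CSV header line
        $first = false;
    }
    // ...any per-row processing you need happens here, one row at a time...
    fputcsv($out, $row);

    if ((++$count % 1000) === 0) {
        flush(); // push buffered output so the download keeps progressing
    }
}

fclose($out);

Depending on the stack, output buffering or on-the-fly compression (in PHP, Apache, or Nginx) may need to be disabled or flushed so the download actually starts before the whole query finishes.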
