
I have an Oracle database storing a large amount of biometric data such as HRV and ECG. I need to export this data to an Excel sheet for each user, but the data set is large: even a single user has more than 100,000 records, and there are currently ~100 users.

What I am doing is:

  1. I run a CRON job from the command line, which I developed in Zend Framework.
  2. I make sure that this CRON job does not overlap with a previous run.
  3. I fetch all the data from the Oracle database for each user, one by one, and store it in an array.
  4. Once I have the data for all users, I use the PHPExcel library to generate the Excel sheet.

    Structure of Excel sheet

        uid 1 | uid 2 | uid 3 | ... | uid n
        data  | data  | data  |
        data  | data  | data  |
        ...
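The accumulation in steps 3 and 4 can be sketched roughly as below (`fetchUserData()` is a hypothetical stand-in for the per-user Oracle query). Every row for every user stays in `$all` until PHPExcel runs, which is where the memory goes:

```php
<?php
// Hypothetical stand-in for the per-user Oracle query.
// The real job would use oci_* calls; here, dummy rows.
function fetchUserData(int $uid): array
{
    return array_fill(0, 5, ['hrv' => 60, 'ecg' => 0.8]);
}

$userIds = [1, 2, 3];
$all = [];

// Step 3: everything is accumulated in one array...
foreach ($userIds as $uid) {
    $all[$uid] = fetchUserData($uid);
}

// Step 4: ...and only then handed to PHPExcel, so peak memory is
// roughly (rows per user) x (users) x (per-row overhead of a PHP array).
printf("Rows held in memory: %d\n", array_sum(array_map('count', $all)));
```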

Problem:

PHP uses about 1.5 GB of RAM to hold the data in the array and pass it to the functions that interact with PHPExcel, but this library runs for 3-4 hours and then I get a fatal "memory limit" error. My system has only 2 GB of RAM.

What steps should I take to optimize my code so it can handle data of this size and present it in Excel format, or do I simply need to increase the RAM?

  • What are your users going to do with that much data in Excel? Commented Dec 22, 2012 at 5:00
  • These users are basically researchers, so maybe they will use this data in MATLAB, or maybe use it to create better algorithms; there are many possibilities. Basically my question is about finding an optimized way to handle this kind of scenario. Commented Dec 22, 2012 at 5:26

2 Answers


Consider using PHP only to produce an Excel-readable delimited file (CSV) containing the result data. (The idea is that you wouldn't be writing to Excel via PHP, because of the limits you are experiencing.)

You could request that PHP-generated file via something like PowerShell, and then have PowerShell (or another scripting tool of your choice) open the CSV file straight into Excel and save it as an .xlsx.

Or you could have your command-line PHP make a `system` call that executes the scripting step which opens the CSV straight into Excel and saves it as an .xlsx.
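A rough sketch of that idea, assuming a hypothetical `rowsForExport()` generator standing in for the Oracle result set: nothing is buffered beyond the current row, and the final `system()` call (commented out, script name made up) would hand the CSV to whatever external step re-saves it as .xlsx:

```php
<?php
// Hypothetical generator standing in for the Oracle result set;
// yielding keeps only one row in memory at a time.
function rowsForExport(): Generator
{
    foreach ([[1, 60, 0.8], [1, 62, 0.7], [2, 58, 0.9]] as $row) {
        yield $row;
    }
}

$csvPath = sys_get_temp_dir() . '/biometrics.csv';
$fh = fopen($csvPath, 'w');
fputcsv($fh, ['uid', 'hrv', 'ecg']);   // header row

foreach (rowsForExport() as $row) {
    fputcsv($fh, $row);                // stream row by row, no big array
}
fclose($fh);

// Hand off to an external converter (hypothetical script name):
// system('powershell -File csv2xlsx.ps1 ' . escapeshellarg($csvPath));
```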


4 Comments

You're right, storing this data in CSV will be much better. But suppose the size of this file grows to 1 GB while PHP is writing it: how much RAM will that take, and won't it cause a memory error? When we open a 1 GB file in a notepad program it hangs; is PHP optimized to handle this?
PHP doesn't have to hold all the data in memory at the same time to write the file. You could have an Oracle query return 1000 rows' worth of data at a time for all 100 users, not just for one user. Write 1000 rows at a time to the file, then get the next 1000 rows from Oracle, append them, and so on. (You can experiment with the number of rows that works best in terms of speed vs. memory consumption.)
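In outline, that loop looks like this; `fetchChunk()` is a hypothetical wrapper around the paginated Oracle query (e.g. a ROWNUM-bounded SELECT), stubbed here with an in-memory array so the sketch is self-contained:

```php
<?php
// Hypothetical stand-in for a paginated Oracle query
// (e.g. a ROWNUM-bounded SELECT returning up to $limit rows).
function fetchChunk(int $offset, int $limit): array
{
    static $rows = null;
    if ($rows === null) {
        $rows = array_map(fn ($i) => [$i, 60 + $i % 5], range(1, 2500));
    }
    return array_slice($rows, $offset, $limit);
}

$batchSize = 1000;
$offset    = 0;
$fh        = fopen(sys_get_temp_dir() . '/chunked.csv', 'w');

while (true) {
    $chunk = fetchChunk($offset, $batchSize);
    if ($chunk === []) {
        break;                  // no more rows
    }
    foreach ($chunk as $row) {
        fputcsv($fh, $row);     // append this batch to the file
    }
    $offset += $batchSize;      // only $batchSize rows ever held at once
}
fclose($fh);
```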
Nice suggestion, but one thing I forgot to mention is that I need multiple sheets in the Excel document, e.g. HRV data of all users on one sheet and breathing rate on another, and I have Linux servers here. So will CSV conversion to XLS work, or do I need to take the same approach for XLS files?
This (sourceforge.net/projects/excelwriterxml) might help you get around memory limits: because it writes to XML files, it would not necessarily need to have a whole object model in memory at once, even when writing multiple sheets. Also, I've seen very large Excel files handled well by constructing the Excel format according to the OpenXML standard, i.e. not really using an Excel PHP library but writing the Excel file yourself. It's more doable than you might think: you only need to create the code to write one sheet, generalize it, and then you can write multi-sheet files.
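A minimal sketch of that write-it-yourself approach, using the Excel 2003 SpreadsheetML dialect (an XML format Excel opens directly). Each `<Worksheet>` is streamed to disk as it is generated, so a whole workbook object never exists in memory; the sheet names and row data here are made up for illustration:

```php
<?php
$path = sys_get_temp_dir() . '/biometrics.xml';
$fh = fopen($path, 'w');

fwrite($fh, "<?xml version=\"1.0\"?>\n"
    . "<Workbook xmlns=\"urn:schemas-microsoft-com:office:spreadsheet\"\n"
    . "          xmlns:ss=\"urn:schemas-microsoft-com:office:spreadsheet\">\n");

// One sheet per metric; rows are streamed, never held all at once.
$sheets = [
    'HRV'            => [[1, 60], [2, 58]],
    'Breathing Rate' => [[1, 14], [2, 16]],
];

foreach ($sheets as $name => $rows) {
    fwrite($fh, "<Worksheet ss:Name=\"{$name}\"><Table>\n");
    foreach ($rows as $row) {
        fwrite($fh, '<Row>');
        foreach ($row as $value) {
            fwrite($fh, "<Cell><Data ss:Type=\"Number\">{$value}</Data></Cell>");
        }
        fwrite($fh, "</Row>\n");
    }
    fwrite($fh, "</Table></Worksheet>\n");
}

fwrite($fh, "</Workbook>\n");
fclose($fh);
```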

If a single user's data is huge, you are using a lot of memory just to hold it; try writing it to the Excel file in batches instead. That will reduce your memory usage.
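One way to read "batch mode": buffer a fixed number of rows, write them out, and drop the buffer before fetching more. A plain-PHP sketch of that pattern, with a made-up row generator and CSV as the output target:

```php
<?php
// Made-up row source standing in for the per-user result set.
function userRows(): Generator
{
    for ($i = 1; $i <= 2300; $i++) {
        yield [$i, 60 + $i % 7];
    }
}

$fh      = fopen(sys_get_temp_dir() . '/batched.csv', 'w');
$buffer  = [];
$flushAt = 500;                 // tune: larger is faster, smaller uses less RAM

foreach (userRows() as $row) {
    $buffer[] = $row;
    if (count($buffer) >= $flushAt) {
        foreach ($buffer as $r) {
            fputcsv($fh, $r);   // write the batch out...
        }
        $buffer = [];           // ...and free it before the next one
    }
}
foreach ($buffer as $r) {       // final partial batch
    fputcsv($fh, $r);
}
fclose($fh);
```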
