I have an Oracle database where a large amount of biometric data (HRV and ECG) is stored. I need to export this data for each user to an Excel sheet. The data set is very large: even a single user has more than 100,000 records, and there are currently ~100 users.
What I am doing is:
- I run a CRON job from the command line, which I developed in Zend Framework.
- I make sure that this CRON job does not overlap with a previous run.
- I fetch all the data from the Oracle database for each user, one by one, and store it in an array.

Once I have the data for all users, I use the PHPExcel library to generate the Excel sheet.
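A simplified sketch of the current flow (connection details, table and column names are placeholders, not my real schema):

```php
<?php
require_once 'PHPExcel.php';

// Placeholder connection details.
$db = new PDO('oci:dbname=//dbhost:1521/ORCL', 'user', 'pass');

// Buffer every user's records in one big array first.
$allData = array();
$uids = $db->query('SELECT DISTINCT uid FROM biometric_data')
           ->fetchAll(PDO::FETCH_COLUMN);

foreach ($uids as $uid) {
    $stmt = $db->prepare(
        'SELECT value FROM biometric_data WHERE uid = :uid ORDER BY recorded_at'
    );
    $stmt->execute(array('uid' => $uid));
    // All ~100,000 rows per user end up in memory here.
    $allData[$uid] = $stmt->fetchAll(PDO::FETCH_COLUMN);
}

// One column per user, one row per record.
$excel = new PHPExcel();
$sheet = $excel->getActiveSheet();
$col = 0;
foreach ($allData as $uid => $rows) {
    $sheet->setCellValueByColumnAndRow($col, 1, 'uid ' . $uid);
    foreach ($rows as $i => $value) {
        $sheet->setCellValueByColumnAndRow($col, $i + 2, $value);
    }
    $col++;
}

$writer = PHPExcel_IOFactory::createWriter($excel, 'Excel2007');
$writer->save('biometrics.xlsx');
```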
Structure of the Excel sheet (one column per user, rows are that user's records):

    uid 1 | uid 2 | uid 3 | ... | uid n
    data  | data  | data  |
    data  | data  | data  |
    ...   |
Problem:
PHP stores the data in an array using about 1.5 GB of RAM and passes it to the functions that interact with PHPExcel, but this library runs for 3-4 hours and then fails with a "Fatal error: Allowed memory size exhausted" error. My system has only 2 GB of RAM.
What steps should I take to optimize my code so it can handle data of this size and export it to Excel, or do I simply need to increase the RAM?