
My scenario is like this: I have a huge dataset fetched from a MySQL table.

$data = $somearray; //say the number of records in this array is 200000

I am looping over this data, doing some processing, and writing it out to a CSV file:

 $my_file = 'somefile.csv';
 $handle = fopen($my_file, 'w') or die('Cannot open file: ' . $my_file);
 for ($i = 0; $i < count($data); $i++) {
     //do something with the data
     self::someOtherFunctionalities($data[$i]); //just some function
     fwrite($handle, $data[$i]['index']); //here I am writing this data to the file
 }
 fclose($handle);

My problem is that the loop runs out of memory: it fails with "Fatal error: Allowed memory size of ... exhausted". Is there any way to process this loop without exhausting memory?

Due to server limitations, I am unable to increase the PHP memory limit like this:

ini_set("memory_limit","2048M");

I am not concerned about how long it takes, even if it takes hours, so I already did set_time_limit(0).

  • Can't you read the MySQL data in batches or process it row by row rather than saving the whole set to a variable? Commented Sep 5, 2014 at 8:30
  • I have tried batch processing also, but it still hangs up. Is there some way we can release the memory at the end of each loop? Commented Sep 5, 2014 at 8:33
  • To release the memory, just set the variable to null; the garbage collector should handle the memory (see the sketch after these comments). Commented Sep 5, 2014 at 8:35
  • Do you need to fetch all the data into memory? Can't you fetch one row at a time? Commented Sep 5, 2014 at 8:35
  • Are you using PDO to fetch the data? Commented Sep 5, 2014 at 8:40
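
A minimal sketch of that "set it to null" suggestion applied to the loop from the question: unset() drops each row once it has been written, so PHP can reclaim that memory as the loop progresses (this only helps if nothing else still holds a reference to $data):

<?php
$my_file = 'somefile.csv';
$handle  = fopen($my_file, 'w') or die('Cannot open file: ' . $my_file);

for ($i = 0, $n = count($data); $i < $n; $i++) {
    self::someOtherFunctionalities($data[$i]); // the question's processing step
    fwrite($handle, $data[$i]['index']);
    unset($data[$i]);          // release this row so its memory can be reclaimed
}

fclose($handle);
gc_collect_cycles();           // optional: collect any remaining cyclic garbage
?>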

3 Answers


Your job is linear and you don't need to load all the data. Use an unbuffered query, and write to php://stdout (no temp file) if you are sending this file straight to an HTTP client.

<?php
$mysqli  = new mysqli("localhost", "my_user", "my_password", "world");
$uresult = $mysqli->query("SELECT Name FROM City", MYSQLI_USE_RESULT);

$my_file = 'somefile.csv'; // or 'php://stdout' when streaming to the client
$handle  = fopen($my_file, 'w') or die('Cannot open file: ' . $my_file);

if ($uresult) {
    while ($row = $uresult->fetch_assoc()) {
        // $row plays the role of $data[$i] in the question
        self::someOtherFunctionalities($row); // just some function
        fwrite($handle, $row['Name']);        // write this row out to the file
    }
    $uresult->close();
}
fclose($handle);
?>
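
If you are fetching with PDO instead of mysqli (as one commenter asked), the equivalent is to turn off buffered queries on the connection. A minimal sketch, assuming the same credentials and table as above and reusing the question's helper:

<?php
$pdo = new PDO("mysql:host=localhost;dbname=world", "my_user", "my_password");
// Stream rows from the server as you iterate instead of buffering the whole result set
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$handle = fopen('somefile.csv', 'w') or die('Cannot open file');
$stmt   = $pdo->query("SELECT Name FROM City");

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    self::someOtherFunctionalities($row); // the question's processing step
    fwrite($handle, $row['Name']);
}
fclose($handle);
?>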

Can you use "LIMIT" in your MySQL query?

The LIMIT clause can be used to constrain the number of rows returned by the SELECT statement. LIMIT takes one or two numeric arguments, which must both be nonnegative integer constants (except when using prepared statements).

With two arguments, the first argument specifies the offset of the first row to return, and the second specifies the maximum number of rows to return. The offset of the initial row is 0 (not 1):

SELECT * FROM tbl LIMIT 5,10; # Retrieve rows 6-15

http://dev.mysql.com/doc/refman/5.0/en/select.html
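
A sketch of how this could plug into the export from the question, assuming a mysqli connection in $mysqli, the mysqlnd driver (for get_result()), and the question's index column; the prepared statement lets the offset be a bound variable rather than a constant, and the ORDER BY (on a hypothetical id column) keeps the batches stable:

<?php
$handle = fopen('somefile.csv', 'w') or die('Cannot open file');
$chunk  = 1000;
$stmt   = $mysqli->prepare("SELECT * FROM tbl ORDER BY id LIMIT ?, ?");

for ($offset = 0; ; $offset += $chunk) {
    $stmt->bind_param("ii", $offset, $chunk);
    $stmt->execute();
    $result = $stmt->get_result();
    if ($result->num_rows === 0) {
        break;                            // no rows left, export finished
    }
    while ($row = $result->fetch_assoc()) {
        self::someOtherFunctionalities($row);
        fwrite($handle, $row['index']);
    }
    $result->free();
}
fclose($handle);
?>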


If you don't worry about time, take 1000 rows at a time and just append them to the end of the file, e.g. write to a temp file that you move and/or rename when the job is done.

<?php
$handle = fopen('somefile.csv.tmp', 'a') or die('Cannot open temp file');
$total  = (int) $mysqli->query("SELECT COUNT(*) FROM `table`")->fetch_row()[0];
for ($i = 0; $i < $total; $i += 1000) {
    $result = $mysqli->query("SELECT * FROM `table` LIMIT $i, 1000");
    while ($row = $result->fetch_assoc()) {
        fwrite($handle, $row['index']); // append this batch to the temp file
    }
}
fclose($handle);
rename('somefile.csv.tmp', 'somefile.csv'); // move/rename when the job is done
?>

This is still just a rough sketch (it assumes an open $mysqli connection), but the process should work.

2 Comments

The query needs an ORDER BY, else it's not guaranteed to select the next batch; it also assumes the data does not change.
Feel free to use the edit button to improve my suggestion; still, as I said, it is just an outline of how to do it, not a polished example.
