
When I execute the following code for a user table of about 60,000 records:

mysql_connect("localhost", "root", "");
mysql_select_db("test");

$result = mysql_query("select * from users");

while ($row = mysql_fetch_object($result)) {
  echo(convert(memory_get_usage(true))."\n");
}


function convert($size) {
  $unit=array('b','kb','mb','gb','tb','pb');
  return @round($size/pow(1024,($i=floor(log($size,1024)))),2).' '.$unit[$i];
}

I get the following error:

PHP Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)

Any thoughts on how to avoid having the script take up additional memory with each pass through the loop? In my actual code I'm trying to provide a CSV download for a large dataset, with a little PHP pre-processing.

Please don't recommend increasing PHP's memory limit; it's a bad idea and, more importantly, it still puts an upper bound on how large a dataset can be processed with this technique.

4 Comments
  • Why not try to paginate the query? Commented Dec 7, 2011 at 1:10
  • Try using php.net/manual/en/function.mysql-unbuffered-query.php Commented Dec 7, 2011 at 1:46
  • The title of this post is misleading. It's not a memory leak; you are working with a huge result set in a way that exceeds a preset memory limit. Commented Dec 7, 2011 at 4:08
  • @Galled -- I'm dumping this data to CSV, so pagination isn't really an option. Commented Dec 8, 2011 at 2:38

3 Answers


mysql_query() buffers the entire result set in PHP memory. That is convenient and generally very fast, but you are running into its drawback: memory use grows with the size of the result set.

mysql_unbuffered_query() avoids this. It does not pull the whole result set at once; it streams rows from the server a piece at a time as you fetch them, so memory use stays roughly constant.
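
A minimal sketch of the question's loop rewritten around mysql_unbuffered_query(), writing CSV straight to the output stream since that is the poster's stated goal (the CSV headers and filename are assumptions, not part of the original code):

mysql_connect("localhost", "root", "");
mysql_select_db("test");

// Rows are streamed from the server as they are fetched, instead of the
// whole result set being buffered into PHP memory up front.
$result = mysql_unbuffered_query("SELECT * FROM users");

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="users.csv"');
$out = fopen('php://output', 'w');

while ($row = mysql_fetch_assoc($result)) {
  fputcsv($out, $row);   // memory stays roughly constant per row
}

mysql_free_result($result);
fclose($out);

Note that with an unbuffered result you cannot use mysql_num_rows(), and you must fetch or free the entire result before sending another query on the same connection.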


1 Comment

Thanks chris, this is exactly what I was looking for. It seems very odd that the default behavior of MySQL is to tie up additional memory for each row being fetched (even when fetching sequentially and not explicitly storing the data in memory), but there you have it.

I'm not 100% sure if this will solve your problem, but have you considered using PDO? It has several advantages; you can read more about them here. If you do go in that direction, there is a similar question about memory usage here.
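
For what it's worth, PDO's MySQL driver can also be told not to buffer result sets. A minimal sketch, assuming the same local test database and users table as the question (credentials are placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=test', 'root', '');

// Ask the MySQL driver to stream results instead of buffering them in PHP.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT * FROM users');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
  // process one row at a time; memory use stays roughly flat
}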



I have had a similar problem. What I did to get it working was to create a temporary file (you can use a hash or something similar to keep track of its name):

  • Pull 10,000 rows and write them to a temporary CSV file.
  • Reload the page (using header() with specific parameters / the session).
  • Pull the next 10,000 rows and append them to the file.
  • When you reach the end of the table, send the file to the user.

Keep looping like that until you have everything; a rough sketch follows below. I had to use this workaround for two reasons:

  1. Timeouts
  2. Out-of-memory errors

The drawbacks of this method are that it requires many HTTP requests to get the data, and rows can change in the meantime. It is a pretty "dirty" way of doing it, but I have yet to find something that works better. Hope that helps.
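
A rough sketch of one pass of that loop. The LIMIT/OFFSET query, the session offset counter, and the temp-file name are all assumptions added for illustration; the original answer gives no code:

session_start();

$chunk  = 10000;
$offset = isset($_SESSION['csv_offset']) ? (int) $_SESSION['csv_offset'] : 0;
$temp   = sys_get_temp_dir() . '/users_export.csv';   // hypothetical temp file

mysql_connect("localhost", "root", "");
mysql_select_db("test");

$result = mysql_query("SELECT * FROM users LIMIT $offset, $chunk");

// Append this chunk of rows to the temp CSV file.
$fh = fopen($temp, 'a');
$rows = 0;
while ($row = mysql_fetch_assoc($result)) {
  fputcsv($fh, $row);
  $rows++;
}
fclose($fh);

if ($rows < $chunk) {
  // Last chunk: stream the finished file to the user and clean up.
  unset($_SESSION['csv_offset']);
  header('Content-Type: text/csv');
  header('Content-Disposition: attachment; filename="users.csv"');
  readfile($temp);
  unlink($temp);
} else {
  // More rows remain: remember progress and reload the page.
  $_SESSION['csv_offset'] = $offset + $chunk;
  header('Location: ' . $_SERVER['PHP_SELF']);
}
exit;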

