
I'm having some serious problems with the PHP Data Objects (PDO) functions. I'm trying to loop through a sizeable result set (~60k rows, ~1 GB) using a buffered query to avoid fetching the whole set.

No matter what I do, the script just hangs on the PDO::query() - it seems the query is running unbuffered (why else would the change in result set size 'fix' the issue?). Here is my code to reproduce the problem:

<?php
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true
    )
);

$rQuery = $Database->query('SELECT id FROM mytable');

// This is never reached because the result set is too large
echo 'Made it through.';

foreach($rQuery as $aRow) {
    print_r($aRow);
}
?>

If I limit the query with some reasonable number, it works fine:

$rQuery = $Database->query('SELECT id FROM mytable LIMIT 10');

I have tried playing with PDO::MYSQL_ATTR_MAX_BUFFER_SIZE and using PDO::prepare() with PDOStatement::execute() as well (though the query above has no parameters), both to no avail. Any help would be appreciated.

3 Answers


If I understand this right, buffered queries involve telling PHP that you want to wait for the entire result set before you begin processing. Prior to PDO, this was the default and you had to call mysql_unbuffered_query if you wanted to deal with results immediately.

Why this isn't explained on the PDO MySQL driver page, I don't know.
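In other words, to stream rows one at a time you want the opposite of what the question's code asks for. A minimal sketch of the fix, assuming the same DSN, credentials, and table as in the question (note: this needs a live MySQL server to actually run):

```php
<?php
// Same connection as the question, but with buffering turned OFF so
// rows stream from the server as they are fetched, instead of the
// client library pulling all ~60k rows into memory first.
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
);
$Database->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$rQuery = $Database->query('SELECT id FROM mytable');

// fetch() pulls one row at a time off the wire; memory use stays flat.
while ($aRow = $rQuery->fetch(PDO::FETCH_ASSOC)) {
    print_r($aRow);
}
?>
```

One caveat with unbuffered queries: you cannot issue another query on the same connection until the current result set is fully fetched or the statement is closed with closeCursor().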


4 Comments

Wow okay I'm an idiot. Don't know what gave me the opposite impression.
Technically, a "buffered" query means the MySQL client library pulls the whole resultset off the TCP stream before handing it back to you.
Hmm, I thought the manual covered the difference between buffered/unbuffered (mysql style) and fetch/fetchAll (PDO style), but looking again it does not. If you want some more background info, you may find the following useful: netevil.org/blog/2008/06/slides-pdo
The revised link to those slides is: wezfurlong.org/blog/2008/jun/slides-pdo

You could try to split it up into chunks that aren't big enough to cause problems:

<?php
$iOffset = 0;
$iLimit  = 100;

do {
    $rQuery = $Database->query(
        'SELECT id FROM mytable ORDER BY id ASC'
        . ' LIMIT ' . $iLimit . ' OFFSET ' . $iOffset
    );
    $aRows = $rQuery->fetchAll();

    foreach ($aRows as $aRow) {
        stuff($aRow);
    }

    $iOffset += $iLimit;
} while (count($aRows) === $iLimit); // a short chunk means we've hit the end
?>

...you get the idea, anyway.

1 Comment

Note that this approach works only for queries on tables that aren't changing - any insert or delete operation can cause you to miss a record or read it twice!
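One way around that drift (a sketch of keyset pagination, not from the answer above; it assumes the mytable/id schema from the question and the stuff() helper used in the answer, and needs a live MySQL server):

```php
<?php
// Keyset pagination: instead of OFFSET, remember the last id actually
// seen and ask for rows strictly after it. Concurrent inserts/deletes
// can no longer shift the window, so no row is skipped or read twice.
$oStmt = $Database->prepare(
    'SELECT id FROM mytable WHERE id > :lastId ORDER BY id ASC LIMIT 100'
);

$iLastId = 0;
do {
    $oStmt->execute(array(':lastId' => $iLastId));
    $aRows = $oStmt->fetchAll(PDO::FETCH_ASSOC);

    foreach ($aRows as $aRow) {
        stuff($aRow);              // hypothetical per-row handler, as above
        $iLastId = $aRow['id'];    // anchor the next chunk here
    }
} while (count($aRows) > 0);
?>
```

This also tends to be faster on large tables, since MySQL can seek directly to the anchor via the index on id instead of scanning past OFFSET rows each time.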

Or maybe you could try mysql functions instead:

$query = mysql_query('SELECT id FROM mytable');
while ($row = mysql_fetch_row($query)) {
    // ...
}

This will likely be faster, since that foreach statement gives the impression of using fetchAll() rather than fetch()-ing each row.

1 Comment

The mysql_* functions are deprecated (and removed entirely in PHP 7) and should not be used.
