
This is my script; it takes too long. How can I improve the execution time?

Here is all of the code:

$count_nr_attributes = count($nume);
$csv[0] = $csv_array_attributes[0];
// note: the loop must compare against count($csv_array_attributes),
// not against the array itself
for ($i = 0; $i < count($csv_array_attributes); $i++) {
    if ($i > 0) {
        $final = "";
        for ($j = 1; $j < $count_nr_attributes; $j++) {
            echo "select val_attr from attributes where id_produs = '$csv_array_attributes[$i]' and nume_attr = '$nume[$j]'";
            echo "<br>";
            $select = mysql_query("select val_attr from attributes where id_produs = '$csv_array_attributes[$i]' and nume_attr = '$nume[$j]'");
            $count = mysql_num_rows($select);
            if ($count == 1) {
                $row = mysql_fetch_array($select);
                $final .= $row['val_attr'] . "%%";
            } else {
                $final .= "no%%";
            }
        }
        echo "<hr>";
    }
    $csv[$i] = $csv_array_attributes[$i] . "%%" . $final;
}

// create CSV
$file = fopen("attributes.csv", "w+");
foreach ($csv as $line) {
    fputcsv($file, explode('%%', $line), "^", "`");
}
fclose($file);

The array $nume contains more than 2500 values, and $csv_array_attributes also contains more than 2500 values. With the two nested for loops the execution takes too long. How can I improve this? Thanks.

The query always returns exactly one value.

  • I take it you meant decrease in the title there... Slightly off-topic, but the mysql_* extension is deprecated; stop using it ASAP. Learn to use mysqli_* and/or PDO. Commented Oct 15, 2014 at 7:48
  • Decrease, certainly; sorry, my mistake. Commented Oct 15, 2014 at 7:49
  • Improve your MySQL query so you don't have to run 6,250,000 database queries. Commented Oct 15, 2014 at 7:53
  • 2500 x 2500 = 6.25 million queries. That will certainly slow things down, yes. Try to rethink your solution; I don't think there is a quick fix for this. Commented Oct 15, 2014 at 7:53
  • Could you please provide the table description (or better yet a link to sqlfiddle.com)? Commented Oct 15, 2014 at 7:53

3 Answers


Each time you execute a SQL query, it takes some time, no matter how many results it returns.

The best thing you can do to decrease the time your script takes is to move the query outside the loops, fetch all the results at once, and process them afterward in your loop.
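As a rough sketch of that idea, using mysqli instead of the deprecated mysql_* functions (this assumes an open connection $mysqli; the table and column names are taken from the question):

```php
<?php
// Sketch only: $mysqli is assumed to be an open mysqli connection and
// $csv_array_attributes the product ids from the question.
$ids = implode(",", array_map(function ($id) use ($mysqli) {
    return "'" . $mysqli->real_escape_string($id) . "'";
}, $csv_array_attributes));

// One query for everything instead of ~6.25 million single-row queries.
$res = $mysqli->query(
    "SELECT id_produs, nume_attr, val_attr
       FROM attributes
      WHERE id_produs IN ($ids)"
);

// Index the rows by product id and attribute name,
// so later lookups are plain array accesses.
$byProduct = [];
while ($row = $res->fetch_assoc()) {
    $byProduct[$row['id_produs']][$row['nume_attr']] = $row['val_attr'];
}
```

The original nested loops can then read from $byProduct instead of issuing a query per product/attribute pair.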


2 Comments

I know, but I need it inside, because the array $nume represents the name of each column in my CSV. For each product id I have to check whether it has the specific attribute name; if it does, I get the attribute value (val_attr), otherwise I put the string "no".
Where did you change the column names?

First of all: don't use mysql_*. In any case, try to redo this with as few queries as possible. I don't have time to analyze all the intentions right now, but try something like this:

$count_nr_attributes = count($nume);
$csv[0] = $csv_array_attributes[0];

// One query for all products instead of one per product/attribute pair.
// If id_produs is not numeric, each id needs quoting and escaping
// before the implode().
$query = "
SELECT id_produs, nume_attr, val_attr FROM attributes WHERE
id_produs IN (" . implode(",", $csv_array_attributes) . ") AND nume_attr BETWEEN 0 AND {$count_nr_attributes}
ORDER BY id_produs";

$res = mysql_query($query);
$lastProduct = -1;
while ($row = mysql_fetch_array($res)) {
    // do whatever you want;
    // just check if id_produs is different than the previous one to emulate
    // $csv[$i] = $csv_array_attributes[$i]."%%".$final;
}
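One possible way to fill in that loop body, keeping the question's "%%" separator (a sketch only: because it walks the rows that exist, attributes with no row at all would still need the "no" fallback, which requires knowing the expected column list from $nume):

```php
$csv = [];
$lastProduct = -1;
$final = "";
while ($row = mysql_fetch_array($res)) {
    if ($row['id_produs'] !== $lastProduct) {
        if ($lastProduct !== -1) {
            $csv[] = $lastProduct . "%%" . $final; // flush the previous product
        }
        $lastProduct = $row['id_produs'];
        $final = "";
    }
    $final .= $row['val_attr'] . "%%";
}
if ($lastProduct !== -1) {
    $csv[] = $lastProduct . "%%" . $final; // flush the last product
}
```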

3 Comments

I thought about a similar solution, but I do not know if the resulting statement can be that big.
2500 ids is not that much... if needed, he can also do some simple batch processing by splitting $csv_array_attributes.
You still run loads of queries. The approach I proposed requires only one, which returns a huge result.

Use an associative array keyed by id_produs. Then, inside the loop, use a second associative array to fetch all records relevant to the first one. I hope it will decrease the time a little. Thanks.
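Concretely, that two-level associative-array idea could look like the following sketch, where $rows stands for the rows of a single query over the attributes table (an assumption; the question fetches them one by one):

```php
// Build a lookup table: $attr[id_produs][nume_attr] => val_attr
$attr = [];
foreach ($rows as $row) {
    $attr[$row['id_produs']][$row['nume_attr']] = $row['val_attr'];
}

// The nested loops from the question then become pure array lookups,
// with no database queries inside:
$csv = [];
foreach ($csv_array_attributes as $i => $id) {
    $final = "";
    for ($j = 1; $j < count($nume); $j++) {
        $final .= (isset($attr[$id][$nume[$j]]) ? $attr[$id][$nume[$j]] : "no") . "%%";
    }
    $csv[$i] = $id . "%%" . $final;
}
```

The work drops from one query per product/attribute pair to one pass over the fetched rows plus constant-time lookups.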

Comments
