I have a curl script which reads the data from a remote source. Below is the current code:
function download_page($path) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $path);
    curl_setopt($ch, CURLOPT_FAILONERROR, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $retValue = curl_exec($ch);
    curl_close($ch);
    return $retValue;
}
$sXML = download_page('http://remotepage.php?function=getItems&count=100&page=1');
$oXML = new SimpleXMLElement($sXML);
foreach ($oXML->results->item->item as $oEntry) {
    $insert_query = mysql_query("INSERT INTO tbl_item (first_name, last_name, date_added) VALUES ('" . $oEntry->firstname . "', '" . $oEntry->lastname . "', '" . date("Y-m-d H:i:s") . "')");
}
The script works, but it is extremely slow to insert; I imagine that's because it's writing each record individually. The count variable is how many records are returned per page, and the page variable is a simple page counter.
I am wondering if there is a way to do a bulk insert statement to insert all 100 records at once.
Thanks in advance.
Are you sure it is the INSERT that is slow and not the cURL request? Try echoing some timing out to the page to make sure you know where it's slowing down. You should also escape your input (with mysql_real_escape_string) and precompile the query so it will be faster if you execute multiple INSERT statements.
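For the bulk insert itself, MySQL accepts multiple tuples after a single VALUES keyword, so you can collect one tuple per record and issue one query per page instead of 100. A minimal sketch: build_bulk_insert is a hypothetical helper, and addslashes stands in for mysql_real_escape_string here only so the string can be built without a live connection.

```php
<?php
// Build one multi-row INSERT statement from an array of [first_name, last_name]
// pairs. With a real connection, swap addslashes() for mysql_real_escape_string().
function build_bulk_insert(array $rows, $date_added) {
    $values = array();
    foreach ($rows as $row) {
        $values[] = "('" . addslashes($row[0]) . "', '"
                  . addslashes($row[1]) . "', '" . $date_added . "')";
    }
    return "INSERT INTO tbl_item (first_name, last_name, date_added) VALUES "
         . implode(', ', $values);
}
```

In your loop you would push array((string)$oEntry->firstname, (string)$oEntry->lastname) onto an array, then call mysql_query(build_bulk_insert($rows, date("Y-m-d H:i:s"))) once per page, which turns 100 round trips into one.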