
Usually I'd use a foreach loop to insert each MySQL database entry in PHP, but here all the information is in one array (multidimensional, I think?).

I'm using http://www.phpclasses.org/quick_csv_import to parse a CSV file, which works fine, but I can't for the life of me work out how to loop over the results as they come and write them to the database.

<?php

$con = mysql_connect("localhost","*****","****");
if (!$con) {
    die('Could not connect: ' . mysql_error());
}

mysql_select_db("*****", $con);

require_once('parsecsv.lib.php');
$csv = new parseCSV();
$csv->auto('prod2.csv');

// $csv->titles holds the column titles as an array
foreach ($csv->titles as $value) {
    // ...
}

// $csv->data holds the actual row values as an array
foreach ($csv->data as $key => $row) {
    // ...
}


?>

Edit: The array looks like:

  [0] => Array
        (
            [ProductCategory] => Cleats and Spikes
            [ProductSubCategory] => Masters Cleats
            [Brand] => Masters
            [ItemNumber] => SAM02BY
            [ShortCode] => SAM0
            [ItemDescription] => Ultra Grip F/T Pk (16) Blk/Yel
            [ProductName] => Ultra Grip  (6mm, F/T, Q/F)
            [Options] => Fast Twist
            [ImageURL] => hSAM02BY.jpg
            [YoutubeURL] => http://www.youtube.com/watch?v=WuSB6rY9myg
            [HostedURL] => 27527890
            [StockStatus] => In Stock
            [QtyAvailable] => 2149
            [RRP] => 8.99
            [Barcode] => 5027084199630
            [DeliveryDue] => 09/03/2012
        )

Any tips will be greatly appreciated

  • So, what have you tried? All you need to do is use a looping construct to iterate over $csv->data. You can then either insert them one by one in your loop or create one large query and insert at the end. Commented Mar 28, 2012 at 16:44
  • Show the result of print_r($csv->data) so we can see the structure of it. Commented Mar 28, 2012 at 16:46
  • I've updated the question with the print. Thanks Commented Mar 28, 2012 at 18:30

2 Answers


I would load the data directly into MySQL using LOAD DATA INFILE -

LOAD DATA INFILE 'prod2.csv' INTO TABLE tbl_name
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES;

This assumes that you have appropriate permissions on the server. prod2.csv would need to be replaced with the full path to the file on the server.
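One wrinkle worth noting: the print_r output in the question shows DeliveryDue in d/m/Y form (09/03/2012), which MySQL will not accept directly into a DATE column. A hedged sketch of handling that during the load, assuming a table named `products` and the column names from the question (adjust both to your actual schema), and assuming the dates are day/month/year:

```sql
-- Sketch: read DeliveryDue into a user variable, then convert it
-- with STR_TO_DATE. Table name and column list are assumptions.
LOAD DATA INFILE '/full/path/to/prod2.csv' INTO TABLE products
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES
    (ProductCategory, ProductSubCategory, Brand, ItemNumber, ShortCode,
     ItemDescription, ProductName, Options, ImageURL, YoutubeURL,
     HostedURL, StockStatus, QtyAvailable, RRP, Barcode, @DeliveryDue)
    SET DeliveryDue = STR_TO_DATE(@DeliveryDue, '%d/%m/%Y');
```

If DeliveryDue is stored as a plain VARCHAR instead, the plain LOAD DATA INFILE above is sufficient.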

To process the CSV in PHP you could use something like this -

<?php

$dsn = 'mysql:dbname=testdb;host=127.0.0.1';
$user = 'dbuser';
$password = 'dbpass';

try {
    $dbh = new PDO($dsn, $user, $password);
} catch (PDOException $e) {
    // stop here, otherwise $dbh is undefined below
    die('Connection failed: ' . $e->getMessage());
}

$sql = 'INSERT INTO `table` (ProductCategory, ProductSubCategory, Brand, ...)
        VALUES (?,?,?, ...)
        ON DUPLICATE KEY UPDATE ProductCategory = VALUES(ProductCategory), ProductSubCategory = VALUES(ProductSubCategory), Brand = VALUES(Brand), ...';

$sth = $dbh->prepare($sql);

require_once 'parsecsv.lib.php';
$csv = new parseCSV();
$csv->auto('prod2.csv');

foreach ($csv->data as $row) {

    // bind params and execute query
    $sth->execute(array($row['ProductCategory'], $row['ProductSubCategory'], $row['Brand']));

}

It is obviously incomplete, as you have not included the details of the tables involved. I would advise using LOAD DATA INFILE if you can, but this will work once you have filled in the blanks. Note that I have written this example using PDO instead of the old mysql_* functions. Reusing the prepared statement across rows also gives a slight performance improvement over building a fresh query each time.
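Since you mention running this from cron, note that LOAD DATA can also be issued from PHP itself. A sketch, assuming the same placeholder credentials as above, a table named `products`, and that your host allows LOCAL INFILE (the PDO::MYSQL_ATTR_LOCAL_INFILE option must be enabled and the server must permit it):

```php
<?php
// Sketch only: run LOAD DATA LOCAL INFILE from a PHP cron script.
// Credentials, table name, and file path are placeholders.
$dsn = 'mysql:dbname=testdb;host=127.0.0.1';

try {
    $dbh = new PDO($dsn, 'dbuser', 'dbpass', array(
        // LOCAL INFILE must be allowed on both client and server
        PDO::MYSQL_ATTR_LOCAL_INFILE => true,
    ));
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $dbh->exec("LOAD DATA LOCAL INFILE '/path/to/prod2.csv'
                INTO TABLE products
                FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
                LINES TERMINATED BY '\r\n'
                IGNORE 1 LINES");
} catch (PDOException $e) {
    die('Load failed: ' . $e->getMessage());
}
```

The LOCAL variant reads the file through the client connection rather than off the server's filesystem, so it avoids the FILE privilege mentioned in the comments.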


5 Comments

+1. If you don't have the FILE privilege on the server, you can use "LOAD DATA LOCAL INFILE" instead.
Hey, thanks for that. However I do want to do it through PHP using a cron as it'll have to be run often and on shared hosting
Have you checked to see if your shared hosting environment allows LOAD DATA INFILE? Do you use Barcode as the unique identifier? What is the table structure that you are updating? Even if you want to do it hourly from cron I would still opt for LOAD DATA INFILE, if it is available to you, as it is fast.
I don't have SSH access, I'm afraid; wouldn't I need SSH?
No. It is just a SQL command. Try uploading a small csv file and loading it into a dummy table. You can run the LOAD DATA INFILE query just the same as any other query, if the server allows. Please update your question with the other details - table structures, unique identifiers, etc

I just did a foreach loop on the individual items in the array, $whatever['item'] etc.
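A minimal sketch of that approach, using the old mysql_* API from the question (table and column names are assumptions based on the print_r output; note these functions are deprecated in favour of PDO or mysqli):

```php
<?php
// Sketch: insert each parsed CSV row with a foreach loop.
// Assumes an open mysql_* connection and a `products` table.
require_once 'parsecsv.lib.php';

$csv = new parseCSV();
$csv->auto('prod2.csv');

foreach ($csv->data as $row) {
    $sql = sprintf(
        "INSERT INTO products (Brand, ItemNumber, RRP) VALUES ('%s', '%s', '%s')",
        mysql_real_escape_string($row['Brand']),
        mysql_real_escape_string($row['ItemNumber']),
        mysql_real_escape_string($row['RRP'])
    );
    mysql_query($sql);
}
```

Escaping each value with mysql_real_escape_string matters here, since CSV fields like the product description can contain quotes.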

