
I'm currently inserting a large number (100k+) of rows into a table as follows:

//START of loop: collect one "(v1, v2, v3)" tuple string per row
$updatedData[] = '(
                      "'.$field_1.'",
                      "'.$field_2.'",
                      "'.$field_3.'"
                   )';
//END of loop

// join all tuples into one multi-row VALUES list
$updatedData = implode(',', $updatedData);

$query = "INSERT INTO table (field_1, field_2, field_3) VALUES $updatedData";

This is all working perfectly, but now I'm looking for a way to update a row instead of inserting a new one when $field_1 and $field_3 match an existing row.

3 Comments
  • Please post the CREATE TABLE statement for the TRACKING table. ON DUPLICATE KEY is based on the primary key. Commented Nov 20, 2010 at 1:11
  • The key is just an incremental column; whether the row should be inserted or updated depends on the values in field_1 and field_3. Commented Nov 20, 2010 at 1:15
  • Please post the CREATE TABLE statements for both tables. Commented Nov 20, 2010 at 21:44

4 Answers


If $field_1 and $field_3 together form the PRIMARY KEY or a UNIQUE index, then you could use

$query = "REPLACE INTO table (field_1,field_2,field_3) VALUES $updatedData";

Edit:

To do what I think you are looking for, you would need something like this:

INSERT INTO table (field_1, field_2, field_3) VALUES $updatedData ON DUPLICATE KEY UPDATE field_2 = VALUES(field_2)

PS: don't forget about max_allowed_packet for large queries.
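For the duplicate rule in the question (matching on $field_1 and $field_3), the table needs a composite UNIQUE index over those two columns; that index is what ON DUPLICATE KEY UPDATE reacts to. A minimal sketch of the full flow (the index name, the $pdo handle, and PDO itself are my assumptions, not part of this answer):

// One-time schema change (hypothetical index name): make (field_1, field_3)
// the duplicate rule that ON DUPLICATE KEY UPDATE fires on.
$pdo->exec('ALTER TABLE `table` ADD UNIQUE KEY uniq_f1_f3 (field_1, field_3)');

// Reuse the multi-row VALUES list built in the question's loop.
$query = "INSERT INTO `table` (field_1, field_2, field_3)
          VALUES $updatedData
          ON DUPLICATE KEY UPDATE field_2 = VALUES(field_2)";
$pdo->exec($query);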


3 Comments

This seems like the easiest way to go, but I read in the manual that REPLACE INTO deletes and then inserts, whereas UPDATE just updates. Wouldn't this have a negative performance impact on large queries?
Yes, "REPLACE INTO" deletes before insert for duplicate rows, and it's slower than "UPDATE" but "ON DUPLICATE KEY UPDATE" is a much slower. I use "REPLACE INTO" only for MyISAM tables, because in InnoDB it has side effects.
Was looking for a similar solution and your VALUES() answer hit the spot. (I needed to be able to reference the values in the insert statement.)

Just take a count of your $updatedData array and run the query in a loop; this will do what you want.

All the best.

Kanji
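Reading this answer as "split the batch and loop over the chunks", a hedged sketch of that idea (the chunk size of 1,000 and the mysqli connection $link are assumptions; it uses the $updatedData array from the question before it is imploded):

// Run one INSERT per chunk so each statement stays under max_allowed_packet.
foreach (array_chunk($updatedData, 1000) as $chunk) {
    $query = 'INSERT INTO `table` (field_1, field_2, field_3) VALUES '
           . implode(',', $chunk)
           . ' ON DUPLICATE KEY UPDATE field_2 = VALUES(field_2)';
    mysqli_query($link, $query); // $link: an open mysqli connection (assumed)
}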



You could try an update statement like this:

UPDATE table
SET    field_2 = '$field_2'
WHERE  field_1 = '$field_1'
AND    field_3 = '$field_3'

Then check whether the number of rows updated was zero. If so, do the insert you already have working. If you've got a lot of data, you probably want an index on field_1 and field_3.

If there's any chance of quotes in your input data, you should use the escaping functions to sanitize it before building the query. You don't want SQL injection attacks coming from your data migration.

Update: I missed that your $updatedData was built in a loop and holds all the rows at once. My solution only works on one row at a time. The best I can think of is to run through all the rows and try an update like the one described above; if the number of rows updated is zero, add the row to an $insertData array just like you added to $updatedData in your question. Then you can do a mass insert at the end, like you already have working.
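A hedged sketch of that hybrid (PDO, the connection details, and the question's table/column names are my assumptions; MYSQL_ATTR_FOUND_ROWS makes rowCount() report matched rows rather than only changed ones, so rows that already hold the new values are not queued for a duplicate insert):

// Assumed DSN/credentials; MYSQL_ATTR_FOUND_ROWS so rowCount() counts
// matched rows, not only changed ones.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass',
               array(PDO::MYSQL_ATTR_FOUND_ROWS => true));

$update = $pdo->prepare(
    'UPDATE `table` SET field_2 = ? WHERE field_1 = ? AND field_3 = ?'
);

$insertData = array();
foreach ($rows as $row) {            // $rows: your source data (assumed shape)
    list($field_1, $field_2, $field_3) = $row;
    $update->execute(array($field_2, $field_1, $field_3));
    if ($update->rowCount() === 0) { // nothing matched: queue for mass insert
        $insertData[] = '(' . $pdo->quote($field_1) . ','
                            . $pdo->quote($field_2) . ','
                            . $pdo->quote($field_3) . ')';
    }
}
if ($insertData) {
    $pdo->exec('INSERT INTO `table` (field_1, field_2, field_3) VALUES '
             . implode(',', $insertData));
}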

2 Comments

Even still, how would I mass update with one query? Your example only shows 1 row being updated.
Sorry, @websiteguru, I missed that $updatedData was built in a loop. I've updated my answer with a hybrid solution. That's the best I can think of.

I found this in the MySQL documentation; I don't know if it works:

UPDATE tbl_name SET fld2 = CASE fld1
WHEN val1 THEN data1
WHEN val2 THEN data2
ELSE fld2 END
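For what it's worth, a hedged sketch of generating that pattern for many rows at once (pure illustration: it assumes field_1 alone identifies a row, which is narrower than the question's (field_1, field_3) rule, and it reuses a $pdo handle like the one in the earlier sketch; the performance concerns in the comments below still apply):

// Hypothetical: build one CASE-based UPDATE covering many rows.
$when = array();
$in   = array();
foreach ($rows as $row) {
    list($f1, $f2) = $row;                  // new value $f2 keyed by $f1
    $when[] = 'WHEN ' . $pdo->quote($f1) . ' THEN ' . $pdo->quote($f2);
    $in[]   = $pdo->quote($f1);
}
$query = 'UPDATE `table` SET field_2 = CASE field_1 '
       . implode(' ', $when)
       . ' ELSE field_2 END WHERE field_1 IN (' . implode(',', $in) . ')';
$pdo->exec($query);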

2 Comments

This would only update one row, not multiple. How would I update multiple rows with one query?
It seems like this would generate a huge case statement for 100,000 rows. I'd be worried about performance. Also, how would you know which rows had successfully been updated and which ones still need to be inserted?
