8

How do I insert, for example, 100,000 rows into a MySQL table with a single query?

1
  • The obvious question is "from what?". If the answer is from memory, then start typing an SQL file very carefully. Otherwise, use an appropriate language and environment. Commented May 21, 2009 at 9:43

4 Answers

8
insert into $table values (1, 'a', 'b'), (2, 'c', 'd'), (3, 'e', 'f');

That will perform an insertion of 3 rows. Continue as needed to reach 100,000. I do blocks of ~1,000 that way when doing ETL work.

If your data is already sitting in a file, transforming it and using LOAD DATA INFILE will be the best method, but I'm guessing you're asking this because you're doing something more like that ETL case.

Also note what somebody else said about the max_allowed_packet size limiting the length of your query.
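
For illustration, a fuller version of that batched insert might look like the following; the table and column names here are hypothetical, not taken from the question:

INSERT INTO my_table (id, col_a, col_b) VALUES
(1, 'a', 'b'),
(2, 'c', 'd'),
(3, 'e', 'f');
-- ...then repeat with the next block of ~1,000 rows until all 100,000 are loaded

Each statement is still a single query, so you end up with roughly 100 statements of 1,000 rows each instead of 100,000 single-row inserts.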

3

You can do a batch insert with the INSERT statement, but your query can't be bigger than (slightly less than) max_allowed_packet.

For 100k rows, depending on the size of the rows, you'll probably exceed this.
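
As a rough sketch, you can check the current limit before picking a chunk size, and raise the server-wide value if you have the privileges for it (the 64 MB figure here is just an example):

SHOW VARIABLES LIKE 'max_allowed_packet';

-- Needs SUPER (or SYSTEM_VARIABLES_ADMIN on MySQL 8); new connections pick up the new value
SET GLOBAL max_allowed_packet = 67108864;  -- 64 MB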

One way would be to split it up into several chunks. This is probably a good idea anyway.

Alternatively you can use LOAD DATA INFILE (or LOAD DATA LOCAL INFILE) to load from a tab-delimited (or other delimited) file. See docs for details.

LOAD DATA isn't subject to the max_allowed_packet limit.
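
A minimal sketch of the LOAD DATA route for a tab-delimited file; the file path, table name, and column names are assumptions for the example:

LOAD DATA LOCAL INFILE '/tmp/rows.tsv'
INTO TABLE my_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(id, col_a, col_b);

Note that the LOCAL variant needs local_infile enabled on both the client and the server.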

0

Try using LOAD_FILE(), or convert your data into an XML file and then use LOAD XML (with ExtractValue() where needed) to load the data into the MySQL database.

This keeps it to a single query and is the fastest option.

I do the same; I had files of around 1.5 GB with millions of rows, and I have used both options in my case.
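
If the XML route is what's meant, a minimal sketch with MySQL's LOAD XML statement (available since 5.5) could look like this; the file and table names are placeholders:

LOAD XML LOCAL INFILE '/tmp/rows.xml'
INTO TABLE my_table
ROWS IDENTIFIED BY '<row>';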

-1

You can't as far as I know. You will need a loop.
