Straight to the question:

The problem: to do async bulk inserts (not necessarily bulk, if MySQL can handle it) using Node.js (coming from a .NET and PHP background).

Example: assume I have 40 (adjustable) functions doing some work (async), each adding a record to the table after its single iteration. It is very probable that more than one function makes an insert call at the same time. Can MySQL handle that directly, considering there is going to be an auto-update field?

In C# (.NET) I would have used a DataTable to hold all the rows from each function, launched a thread for each function, and at the end bulk-inserted the DataTable into the database table.
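
Roughly what I am trying to do in Node.js (a minimal sketch, assuming the mysql npm package and a hypothetical results table; doSomeAsyncWork() just stands in for the real work):

    var mysql = require('mysql');
    var connection = mysql.createConnection({
      host: 'localhost',
      user: 'user',
      password: 'secret',
      database: 'mydb'
    });

    // placeholder for the real async work each function does
    function doSomeAsyncWork(i, callback) {
      setTimeout(function () { callback(null, 'result ' + i); }, 10);
    }

    for (var i = 0; i < 40; i++) {
      doSomeAsyncWork(i, function (err, value) {
        // each function inserts one record when its iteration finishes
        connection.query('INSERT INTO results SET ?', { value: value }, function (err) {
          if (err) throw err;
        });
      });
    }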

What approach would you suggest in this case?

Should the approach change if I need to handle 10,000 or 4 million rows per table? Also, the DB schema is not going to change; would MongoDB be a better choice for this?

I am new to Node and NoSQL and still in the noob learning phase at the moment, so if you can provide some explanation with your answer, it would be awesome.

Thanks.

EDIT: Answer: Neither MySQL nor MongoDB supports any special bulk insert at the protocol level; under the hood it is just a foreach loop. Both of them are capable of handling a large number of connections simultaneously; the performance will largely depend on your requirements and production environment.

1 Answer

1) In MySQL, queries are executed sequentially per connection. If you are using one connection, your ~40 functions will result in 40 queries being enqueued (via the explicit queue in the mysql library, your own code, or a system queue based on synchronisation primitives), not necessarily in the same order you started the 40 functions. MySQL won't have any race condition problems with auto-update fields in that case.
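
A minimal sketch with the mysql npm package (table and column names are made up): all 40 inserts go through one connection, so the driver simply queues them.

    var mysql = require('mysql');
    var connection = mysql.createConnection({
      host: 'localhost',
      user: 'user',
      password: 'secret',
      database: 'mydb'
    });

    for (var i = 0; i < 40; i++) {
      // query() returns immediately; the driver enqueues the statement and
      // sends it over the single connection once the previous one has finished
      connection.query('INSERT INTO results SET ?', { worker: i }, function (err, res) {
        if (err) throw err;
        console.log('inserted row', res.insertId); // AUTO_INCREMENT id assigned by MySQL
      });
    }

    connection.end(); // closes after all queued queries have run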

2) If you really want to execute 40 queries in parallel, you need to open 40 connections to MySQL (which is not a good idea from a performance point of view, but again, MySQL is designed to handle auto-increments correctly for multiple clients).
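
If you do want that parallelism, a pool sketch (again assuming the mysql package) would look roughly like this; connectionLimit caps how many inserts are actually in flight at once:

    var mysql = require('mysql');
    var pool = mysql.createPool({
      connectionLimit: 40,   // one connection per worker; a smaller limit is usually wiser
      host: 'localhost',
      user: 'user',
      password: 'secret',
      database: 'mydb'
    });

    for (var i = 0; i < 40; i++) {
      // each call may grab a different connection from the pool,
      // so inserts can run concurrently on the MySQL side
      pool.query('INSERT INTO results SET ?', { worker: i }, function (err, res) {
        if (err) throw err;
        console.log('inserted row', res.insertId);
      });
    }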

3) There is no special bulk insert command in the MySQL protocol at the wire level; any library exposing a bulk insert API is in fact just issuing one long 'INSERT ... VALUES' query.
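
For example, with the mysql package a nested array is expanded into exactly that kind of long query (a sketch, hypothetical results table):

    var mysql = require('mysql');
    var connection = mysql.createConnection({
      host: 'localhost',
      user: 'user',
      password: 'secret',
      database: 'mydb'
    });

    var rows = [
      [1, 'first'],
      [2, 'second'],
      [3, 'third']
    ];

    // sends: INSERT INTO results (worker, label) VALUES (1, 'first'), (2, 'second'), (3, 'third')
    connection.query('INSERT INTO results (worker, label) VALUES ?', [rows], function (err, res) {
      if (err) throw err;
      console.log(res.affectedRows + ' rows inserted');
    });

    connection.end();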

3 Comments

Thanks for your reply. 1) As I continued to search, I read that MongoDB can handle large insertions better than MySQL. Would you agree, assuming that each of the 40 functions will be carrying 10,000 or more data records? I am learning it all, so I am not worried about time or production environments at the moment, but I would still like to know the pros/cons, if any.
Try both and do some benchmarks. I've seen both MySQL and Mongo handle hundreds of thousands of inserts per second.
Sure, that will be fun :). Thanks for your time.
