
I am trying to import a large JSON data set file into MongoDB using mongoimport.

mongoimport --db test --collection sam1 --file 1234.json --jsonArray

error:
2014-07-02T15:57:16.406+0530 error: object to insert too large
2014-07-02T15:57:16.406+0530 tried to import 1 objects
  • What does your file look like? Commented Jul 2, 2014 at 18:43
  • MongoDB has a 16 MB limit for a single JSON document. You've likely got too much data. (Or you shouldn't be using --jsonArray.) Commented Jul 2, 2014 at 18:49
  • My JSON file is 17 MB, and a single document inside it occupies more than 16 MB of data. How can I parse this data other than with mongoimport (since I am already facing an issue with mongoimport)? Commented Jul 3, 2014 at 7:06
  • If you have JSON documents larger than 16 MB, you'll need to refactor those into smaller documents. The maximum document size limit is imposed by the MongoDB server, not mongoimport. As noted in an earlier comment, you probably shouldn't be using --jsonArray, since the limit applies to the whole array as one object; splitting it into one line per array element (see the sketch after these comments) may fix your import issue. Commented Jul 5, 2014 at 13:41
  • @MoreThanFive, do you know how to do it? Please check this question: stackoverflow.com/questions/33034785/… Commented Oct 13, 2015 at 7:01
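
A rough sketch of the "one line per array element" approach mentioned above, assuming the file is a single top-level JSON array small enough to load in memory (a ~17 MB file is fine) and that no individual element exceeds the 16 MB document limit. The input name 1234.json comes from the question; the output name 1234.ndjson is an arbitrary choice:

import json

# Load the whole JSON array from the file mentioned in the question.
with open("1234.json", "r") as src:
    docs = json.load(src)

# Write one document per line (newline-delimited JSON), which is the
# default input format mongoimport expects without --jsonArray.
with open("1234.ndjson", "w") as dst:
    for doc in docs:
        dst.write(json.dumps(doc) + "\n")

The converted file could then be imported without the --jsonArray flag:

mongoimport --db test --collection sam1 --file 1234.ndjson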

1 Answer


Please try adding this option: --batchSize 1

Like this:

mongoimport --db test --collection sam1 --file 1234.json --batchSize 1

Data will be parsed and stored into the database batch by batch.
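
Note that if the file is still a single top-level JSON array, the --jsonArray flag presumably still needs to be combined with --batchSize, for example:

mongoimport --db test --collection sam1 --file 1234.json --jsonArray --batchSize 1

--batchSize only controls how many documents are sent to the server per batch; it does not lift the 16 MB per-document limit.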
