
I have an 800 KB XML file on a server. I download it, parse it with a SAXParser, and then insert all the item elements into my SQLite database on the phone.

Each item has 50-60 elements. One insert query takes about:

11-18 21:15:54.079: ERROR/448 delay_(9169): 41207
11-18 21:15:54.099: ERROR/448 delay__(9169): __ 41223

About 20-90 ms per insert. I have 500 rows, and the whole load takes 40 minutes. Is that a reasonable time for this?

How can I do it faster? Is it possible?
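For reference, the current load is roughly one db.insert() per parsed item, something like the sketch below (the items table, its columns, and the Item class are placeholders, not my real schema):

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import java.util.List;

// Roughly the current approach: one db.insert() per parsed item, so every
// row runs in its own implicit transaction (and triggers its own flash write).
void insertAll(SQLiteDatabase db, List<Item> items) {
    for (Item item : items) {
        ContentValues values = new ContentValues();
        values.put("title", item.getTitle());
        values.put("description", item.getDescription());
        // ... 50-60 more columns per item ...
        db.insert("items", null, values);
    }
}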

1 Answer


Wrap your INSERTs in transactions. By default, each INSERT is a transaction and involves writing to flash. You get better results on bulk data loads by having fewer transactions. For 500 rows, perhaps do one transaction per 100 rows or something.
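A minimal sketch of what that looks like with the Android SQLiteDatabase transaction API, committing every batchSize rows; the items table, its columns, and the Item class are placeholders for illustration:

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import java.util.List;

// Sketch: wrap the inserts in explicit transactions and commit in batches.
void bulkInsert(SQLiteDatabase db, List<Item> items, int batchSize) {
    db.beginTransaction();
    try {
        for (int i = 0; i < items.size(); i++) {
            ContentValues values = new ContentValues();
            values.put("title", items.get(i).getTitle());
            values.put("description", items.get(i).getDescription());
            db.insert("items", null, values);

            // Commit every batchSize rows, then open a new transaction.
            if ((i + 1) % batchSize == 0) {
                db.setTransactionSuccessful();
                db.endTransaction();
                db.beginTransaction();
            }
        }
        db.setTransactionSuccessful();  // commit the final partial batch
    } finally {
        db.endTransaction();
    }
}

Reusing a compiled SQLiteStatement instead of building a ContentValues per row can also help a little, but the transaction batching is where the big win is.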


1 Comment

Agree with the above. Also, play with the number of rows per transaction: 100 per transaction might be fastest for one insert, 20 for another, so just experiment.
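One rough way to do that, assuming the bulkInsert() sketch in the answer above: time a full load for a few candidate batch sizes and keep the fastest. The values below are arbitrary starting points.

import android.os.SystemClock;
import android.util.Log;

// Try a few batch sizes and log how long each full load takes.
// The table is cleared between runs so every measurement inserts the same data.
int[] candidates = {20, 50, 100, 250};
for (int batchSize : candidates) {
    db.delete("items", null, null);  // reset the (assumed) items table
    long start = SystemClock.elapsedRealtime();
    bulkInsert(db, items, batchSize);
    long elapsed = SystemClock.elapsedRealtime() - start;
    Log.d("BulkInsert", "batchSize=" + batchSize + " took " + elapsed + " ms");
}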
