
I have implemented a CSV import in my Rails app, and the problem now is that RAM consumption is too high. My CSV has only about 3,600 records, yet RAM usage climbs to around 600 MB - 800 MB and is not released even after the import method returns.

I can see a similar issue in the roo issue list: https://github.com/roo-rb/roo/issues/179

I am working with Rails 4.2.6 and Ruby 2.2.4.

Code:

    data = CSV.foreach(file.path, headers: true)

    data.each do |row|
      Model.create(row ....)
      .........
    end
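As a side note, `CSV.foreach` called without a block (as in the snippet above) already returns an `Enumerator` that streams rows from disk, so the growth is more likely coming from the per-row `Model.create` allocations than from the CSV parsing itself. A minimal, self-contained check of that streaming behavior, using only the standard library and a throwaway file:

```ruby
require "csv"
require "tempfile"

# Throwaway two-row CSV, just to inspect what CSV.foreach returns.
file = Tempfile.new(["sample", ".csv"])
file.write("a,b\n1,2\n3,4\n")
file.flush

rows = CSV.foreach(file.path, headers: true)
puts rows.class        # Enumerator -- nothing has been read from disk yet
puts rows.first["a"]   # "1" -- rows are pulled one at a time, on demand
```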

Initially my RAM consumption is:

    #<GetProcessMem:0x7fd3083b2a30 @mb=204.52734375>

Before returning from the method:

    #<GetProcessMem:0x7fd30ae1a7a0 @mb=289.60546875>

After completing and redirecting to the home page:

    #<GetProcessMem:0x7fd2fb3913d8 @mb=629.61328125>

With this much RAM consumption I cannot deploy it on Heroku. The consumed RAM is never released; I have to restart my server to reclaim it.

Does anyone have a solution, or an alternative way to import the CSV?

  • I think you're just doing it wrong; try this link to see a comparison of all the methods, the last two in particular: dalibornasevic.com/posts/… Commented Feb 1, 2017 at 19:43

1 Answer


Use the smarter_csv gem and its batch processing. It cuts the import into chunks, which lowers the amount of RAM the operation needs.

For example, in 500-row chunks:

    SmarterCSV.process(file, chunk_size: 500) do |chunk|
      chunk.each do |row|
        user = User.find_by_order_id(row[:order_id])
        user.update_attribute(:shipping_id, row[:order_id]) if user.present?
      end
    end
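If adding a gem isn't an option, the same chunking idea can be reproduced with the standard library alone: `CSV.foreach` without a block yields rows lazily, and `each_slice` groups them. A sketch under that assumption (the 500-row chunk size and the `order_id`/`shipping_id` column names are just carried over from the example above):

```ruby
require "csv"
require "tempfile"

# Build a throwaway 1200-row CSV so the sketch is self-contained.
file = Tempfile.new(["orders", ".csv"])
file.write("order_id,shipping_id\n")
1.upto(1200) { |i| file.write("#{i},ship-#{i}\n") }
file.flush

chunks = 0
# CSV.foreach without a block returns an Enumerator, so each_slice
# pulls 500 rows at a time instead of materializing the whole file.
CSV.foreach(file.path, headers: true).each_slice(500) do |rows|
  chunks += 1
  rows.each do |row|
    row["order_id"] # placeholder for the real per-row lookup/update
  end
end

puts chunks # 3 (1200 rows / 500 per chunk)
```

Wrapping each chunk's database work in a single transaction can further cut per-row commit overhead, at the cost of larger transactions.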