
In my Rails 4.2 app, a :user belongs_to :country and a :country has_many :users.

In a migration I have the following script:

default_country = Country.find_or_create_by(code: 'it', name: 'Italy')

User.find_each do |user|
  country = user.country || default_country
  params_to_update = {}

  params_to_update[:country_id] = country.id unless user.country_id

  # get list of user's country areas ids
  user_country_areas_ids = Area.by_country(country).pluck(:id)

  # update user's area if current area does not belong to user's country
  params_to_update[:area_id] = country.default_area.id unless user_country_areas_ids.include?(user.area_id)

  # update user's locale with user country's default locale
  params_to_update[:locale] = country.domain.default_locale

  user.update_columns(params_to_update)
end

I can see from the log that every time the variable country is used, a query is performed, like:

[2018-01-03T14:04:05.663035 #17647] DEBUG -- :
Country Load (0.3ms)  SELECT  "countries".*
FROM "countries"
WHERE "countries"."id" = 1
LIMIT 1  [["id", 1]]

Every query takes around 0.3 ms, which on my large DB adds up to a fairly big amount of time. Is there a way to keep the country object in memory to avoid hitting the DB every time it's called?

3 Answers


Could you retrieve your users this way? It makes just two SQL queries: one for the users and one for their countries.

User.includes(:country).each do |user|
  # ...
end
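
If the users table is large, the same eager loading can be combined with the batch iteration from the question; a minimal sketch (assuming find_each applies the includes to each batch, which it does in Rails 4.2):

default_country = Country.find_or_create_by(code: 'it', name: 'Italy')

# Eager load each batch's countries so user.country no longer fires one query per user
User.includes(:country).find_each do |user|
  country = user.country || default_country
  # ... same per-user update logic as in the question ...
end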



This caching is performed by the database itself, and there is no need (I even doubt it is possible) for you to handle it in your application. And since the id column, I assume, is indexed, you can be sure that the database will give you the best possible execution time.

As for why I think such caching is not possible in application code: think from the perspective of your DB. Every time you tell it to grab country.default_area or country.id, it needs to check which country you are talking about, hence the extra query in the logs. And as I said, caching of any kind here is the responsibility of the database.

As far as optimization is concerned, you can consider writing native SQL queries to grab the data in one go, without using the ORM, or have a look at your database's configuration to check for supported caching mechanisms.
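
As a rough illustration of the "grab data in one go" idea, here is a sketch (column names taken from the question, not from this answer) that fetches all country rows with a single raw query and keeps them in memory as plain hashes, bypassing ActiveRecord instantiation:

rows = ActiveRecord::Base.connection.select_all(
  "SELECT id, code, name FROM countries"
)

# Depending on the adapter, values may come back as strings, hence to_i
countries_by_id = rows.index_by { |row| row["id"].to_i }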



You can load all countries into a hash at the beginning. Even if the SQL query is fast, this still avoids the cost of creating an ActiveRecord instance on each lookup, which is always expensive in CPU and memory.

default_country = Country.find_or_create_by(code: 'it', name: 'Italy')

# Store all countries in a hash. Since there are around 200, that is reasonable.
countries = Country.all.reduce({}) do |hash, country|
  hash[country.id] = country
  hash
end

User.find_each do |user|
  # Prevent from loading a new country ActiveRecord instance
  country = countries[user.country_id] || default_country

  # ...
end

Note 1: If you are interested in performance, I have created a gem and a service to help profile and monitor Rails apps: https://www.rorvswild.com/ and https://github.com/BaseSecrete/rorvswild.

Note 2: I am pretty sure you can achieve the same thing with a single SQL UPDATE using sub-selects. That would be the fastest way :-)
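
For illustration only, a sketch of what such an update could look like for the locale column, assuming a domains table with a default_locale column and a countries.domain_id foreign key (the question only shows country.domain.default_locale, so the exact schema is a guess):

# Hypothetical single-statement update for the locale column only
ActiveRecord::Base.connection.execute(<<-SQL)
  UPDATE users
  SET locale = (
    SELECT domains.default_locale
    FROM countries
    INNER JOIN domains ON domains.id = countries.domain_id
    WHERE countries.id = users.country_id
  )
  WHERE users.country_id IS NOT NULL
SQL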

