I have Ruby models which are populated from the responses of API calls in the following way:
- `JSON.parse` converts the response to a `Hash`
- the `Hash` is passed into the `initialize` method of a class
- the `initialize` method converts camelCase hash keys and assigns underscore_case instance variables
- controller code works with these instances and converts back to JSON to send to the browser
This works fine, but some of these response objects are large. Others are arrays of large objects.
Profiling shows that this process consumes a lot of CPU (and memory, but that is less of a concern). That makes sense: I build intermediate hashes just to build objects, and the conversion between camelCase and underscore_case happens constantly. What libraries or techniques have you come across that solve this problem?
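One partial fix I can see is memoizing the key conversion itself, so each distinct key is transformed only once per process and later lookups are plain `Hash` reads. A rough sketch (the `UNDERSCORE` constant is just an illustrative name, and the regex is a dependency-free stand-in for ActiveSupport's `String#underscore`):

```ruby
# Memoize camelCase -> underscore_case: each distinct key pays the regex
# cost once; every subsequent conversion is a frozen-Hash lookup.
UNDERSCORE = Hash.new do |cache, key|
  cache[key] = key.gsub(/([a-z\d])([A-Z])/, '\1_\2').downcase.freeze
end

UNDERSCORE["abcDef"] # => "abc_def" (computed once)
UNDERSCORE["abcDef"] # => "abc_def" (cached lookup, same frozen string)
```

Since the set of keys from the API is small and stable, the cache stays tiny, but I don't know if this alone buys enough.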
Here is an oversimplified example:
JSON response from a third party API (unlikely to change):
"{\"abcDef\": 123, \"ghiJkl\": 456, \"mnoPqr\": 789}"
Class definition (attributes unlikely to change):
```ruby
class Data
  # String#underscore and String#camelize come from ActiveSupport
  attr_accessor :abc_def, :ghi_jkl, :mno_pqr

  def initialize(attributes = {})
    attributes.each do |key, val|
      send "#{key.underscore}=", val
    end
  end

  def as_json(options = nil)
    instance_variables.each_with_object({}) do |iv, hash|
      iv_name = iv.to_s[1..-1]
      v = send(iv_name) if respond_to?(iv_name)
      hash[iv_name.camelize(:lower)] = v.respond_to?(:as_json) ? v.as_json(options) : v
    end
  end
end
```
Controller:
```ruby
get '/' do
  d = Data.new JSON.parse(api.get)
  # ... do some work ...
  content_type 'application/json'
  d.to_json
end
```