
I am trying to build a caching flow: on a user request, a big dict of 870 records is cached and should stay in the cache for some time. Once the defined time has passed, the dict should be refreshed in the cache on the next request.

So I have created a function like this:

import datetime

from django.core.cache import get_cache

def update_values_mapping():
    cache_en = get_cache('en')
    values_dict = get_values_dict()  # makes a request to obtain the dict of values
    cache_en.set_many(values_dict, 120)  # 120 s timeout, for testing
    # store the moment after which the mapping should be refreshed
    cache_en.set('expire', datetime.datetime.now() + datetime.timedelta(seconds=120))

Then, in a second function, I try to get values from the cache:

import datetime

from django.core.cache import get_cache

def get_value_details(_id):
    cache = get_cache('en')
    details = cache.get(_id, {})  # cached values have a timeout, so they should eventually be gone
    expire = cache.get('expire', None)
    if not details and expire and expire < datetime.datetime.now():
        update_values_mapping()
        details = cache.get(_id, {})  # re-read after refreshing the mapping

    return details

While rendering a view, get_value_details() is called many times to obtain all the needed values.

The problem is that some of the values are missing, e.g. cache.get('b', {}) returns {} even though the value 'b' was saved to the cache (and its expiry date has not passed yet). The missing values change: sometimes it is 'a', sometimes 'b', other times 'c', etc.

I have been testing it on LocMemCache and DummyCache so far. My example cache settings:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'cache-default'
    },
    'en': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'cache-en'
    },
    'pl': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'cache-pl'
    }
}

When I was playing with this in a console, some of the values were disappearing from the cache after the next call to update_values_mapping(), but some were missing from the beginning.

Does anyone have any clue what it could be? Or maybe how to solve the described flow in another way?

1 Answer

LocMemCache is exactly that - a local memory cache. That means it's local to the particular server process, and won't be visible either in other processes or in the console.
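
For illustration, here is a minimal standalone sketch of that behaviour (the script-level settings.configure() call, the writer() helper and the 'b' key are made up for the example; it uses the newer caches dict, which works the same way as get_cache):

import multiprocessing

import django
from django.conf import settings

# illustrative standalone configuration mirroring the 'en' cache from the question
settings.configure(CACHES={
    'en': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'cache-en',
    },
})
django.setup()

from django.core.cache import caches


def writer():
    # runs in a child process and writes only to that process's own in-memory cache
    caches['en'].set('b', {'name': 'details of b'}, 120)


if __name__ == '__main__':
    p = multiprocessing.Process(target=writer)
    p.start()
    p.join()
    # the parent process never sees the key written by the child
    print(caches['en'].get('b', {}))  # prints {}

Run as a script, the parent prints {} even though the child just stored the key, which matches values "randomly" going missing once several worker processes are involved.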

If you need something that is shared across all processes, you should use a proper cache backend like memcached or redis.
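
As a sketch only (assuming a Memcached server is reachable on 127.0.0.1:11211; swap in whatever shared backend and location you actually run), the settings above could become:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
    },
    # KEY_PREFIX keeps the per-language entries apart on the shared server,
    # playing the role the distinct LOCATIONs had with LocMemCache
    'en': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'KEY_PREFIX': 'en',
    },
    'pl': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'KEY_PREFIX': 'pl',
    },
}

On recent Django versions the PyMemcacheCache or the built-in Redis backend fills the same role; update_values_mapping() and get_value_details() do not need to change, only the configuration does.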
