
I am using redis-server as part of a Docker stack in a Django project that uses Celery Beat for scheduled tasks. While monitoring the processes with htop, I noticed that the memory used by redis-server increases gradually and continuously over time. Are there recommended practices or settings I should apply to manage the memory used by redis-server, especially in an environment with Celery Beat?
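Before capping anything, it may help to find out which keys are actually growing. A minimal diagnostic sketch (assumes the redis-py package and the 6399 host port mapped in local.yml below; names and values are illustrative): sample keys with SCAN, measure each with MEMORY USAGE, and total the bytes per key prefix, e.g. to distinguish accumulated celery-task-meta-* results from broker queue keys.

```python
from collections import defaultdict


def totals_by_prefix(key_sizes, depth=2):
    """Group a {key: size_in_bytes} mapping by the first `depth`
    dash-separated segments of each key name."""
    totals = defaultdict(int)
    for key, size in key_sizes.items():
        prefix = "-".join(key.split("-")[:depth])
        totals[prefix] += size
    return dict(totals)


def report(host="localhost", port=6399, sample=1000):
    """Connect to Redis, sample keys, and print prefixes by total size."""
    import redis  # pip install redis

    r = redis.Redis(host=host, port=port)
    sizes = {}
    for key in r.scan_iter(count=sample):
        # MEMORY USAGE returns None for keys deleted mid-scan
        sizes[key.decode()] = r.memory_usage(key) or 0
    for prefix, total in sorted(
        totals_by_prefix(sizes).items(), key=lambda kv: -kv[1]
    ):
        print(f"{prefix}: {total / 1024:.1f} KiB")
```

If most of the memory sits under celery-task-meta-*, the growth is accumulated task results rather than the broker queue itself.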

Docker version 24.0.7
Docker Compose version v2.21.0

local.yml

  redis:
    image: redis:6
    container_name: scielo_core_local_redis
    ports:
      - "6399:6379"


  celeryworker:
    <<: *django
    image: scielo_core_local_celeryworker
    container_name: scielo_core_local_celeryworker
    depends_on:
      - redis
      - postgres
      - mailhog
    ports: []
    command: /start-celeryworker

  celerybeat:
    <<: *django
    image: scielo_core_local_celerybeat
    container_name: scielo_core_local_celerybeat
    depends_on:
      - redis
      - postgres
      - mailhog
    ports: []
    command: /start-celerybeat

base.py

# Celery
# ------------------------------------------------------------------------------
if USE_TZ:
    # http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-timezone
    CELERY_TIMEZONE = TIME_ZONE
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-broker_url
CELERY_BROKER_URL = env("CELERY_BROKER_URL")
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_backend
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-accept_content
CELERY_ACCEPT_CONTENT = ["json"]
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-task_serializer
CELERY_TASK_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_serializer
CELERY_RESULT_SERIALIZER = "json"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_TIME_LIMIT = 5 * 60
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-soft-time-limit
# TODO: set to whatever value is adequate in your circumstances
CELERY_TASK_SOFT_TIME_LIMIT = 36000
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#beat-scheduler
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
# http://docs.celeryproject.org/en/latest/userguide/configuration.html
DJANGO_CELERY_BEAT_TZ_AWARE = False

# Celery Results
# ------------------------------------------------------------------------------
# https: // django-celery-results.readthedocs.io/en/latest/getting_started.html
CELERY_RESULT_BACKEND = "django-db"
CELERY_CACHE_BACKEND = "django-cache"
CELERY_RESULT_EXTENDED = True
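Note that if the growing keys turn out to be Celery results stored in Redis, Celery's result_expires setting bounds how long they are kept (the default is one day). A hedged sketch for base.py, using the same CELERY_ settings namespace as above; it only matters while results actually live in Redis, since the final CELERY_RESULT_BACKEND = "django-db" assignment above overrides the earlier Redis one:

```python
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#result-expires
# Illustrative value: keep task results for one hour instead of the
# default 24 hours, so celery-task-meta-* keys cannot pile up.
CELERY_RESULT_EXPIRES = 60 * 60
```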

Output of INFO MEMORY:

# Memory
used_memory:8538978880
used_memory_human:7.95G
used_memory_rss:6425821184
used_memory_rss_human:5.98G
used_memory_peak:8610299728
used_memory_peak_human:8.02G
used_memory_peak_perc:99.17%
used_memory_overhead:1300368
used_memory_startup:811864
used_memory_dataset:8537678512
used_memory_dataset_perc:99.99%
allocator_allocated:8539119712
allocator_active:8861048832
allocator_resident:8901853184
total_system_memory:16559783936
total_system_memory_human:15.42G
used_memory_lua:32768
used_memory_lua_human:32.00K
used_memory_scripts:296
used_memory_scripts_human:296B
number_of_cached_scripts:1
maxmemory:0
maxmemory_human:0B
maxmemory_policy:noeviction
allocator_frag_ratio:1.04
allocator_frag_bytes:321929120
allocator_rss_ratio:1.00
allocator_rss_bytes:40804352
rss_overhead_ratio:0.72
rss_overhead_bytes:-2476032000
mem_fragmentation_ratio:0.75
mem_fragmentation_bytes:-2113157632
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:487872
mem_aof_buffer:0
mem_allocator:jemalloc-5.1.0
active_defrag_running:0
lazyfree_pending_objects:0
lazyfreed_objects:0

htop (screenshot not reproduced here)

1 Answer

You could try setting the maxmemory option in your redis configuration:

Create redis.conf

echo "maxmemory 128M" > redis.conf
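Since the INFO output above shows maxmemory_policy:noeviction, Redis will start refusing writes once maxmemory is reached instead of freeing memory, so you will likely also want to pick an eviction policy. A hedged sketch of the resulting redis.conf (allkeys-lru is one option; but with Redis acting as a Celery broker, evicting arbitrary keys can drop queued tasks, so a size cap combined with expiring results may be safer than aggressive eviction):

```
maxmemory 128mb
maxmemory-policy allkeys-lru
```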

Mount the file in docker compose and tell redis-server to load it (the official redis image does not read a config file unless one is passed on the command line):

  redis:
    image: redis:6
    container_name: scielo_core_local_redis
    command: redis-server /usr/local/etc/redis/redis.conf
    ports:
      - "6399:6379"
    volumes:
      - ./redis.conf:/usr/local/etc/redis/redis.conf

Ref: https://redis.io/docs/get-started/faq/#how-can-i-reduce-redis-overall-memory-usage

If this works, you can tune maxmemory to fit your case. Another option is limiting the number of connections; by default, Redis accepts up to 10,000 clients. Read more here: https://redis.io/docs/reference/clients/
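For the connection-limiting option, the relevant redis.conf directives are maxclients and timeout; a hedged sketch (the values are illustrative, not recommendations):

```
# Cap simultaneous client connections (default 10000)
maxclients 1000
# Close idle client connections after 300 seconds (default 0 = never)
timeout 300
```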
