
I am using redis-om to make it easier to do aggregations with the RediSearch module. I want to upload some data to Redis every day without keeping the data uploaded the day before; that is, I want to either make the old data expire or overwrite it with the new data.

I am following the example from the Redis documentation, using Redis Cloud. This is my model:

import datetime
from typing import Optional
from pydantic import EmailStr
from redis_om import HashModel


class Customer(HashModel):
    first_name: str
    last_name: str
    email: EmailStr
    join_date: datetime.date
    age: int
    bio: Optional[str]

This is how I upload the data:

import csv

import os
os.environ["REDIS_OM_URL"]="redis://default:TbmcFFUUjPiOakJA5RcZKV1DBNRRFV9L@redis-18508.c228.us-central1-1.gce.cloud.redislabs.com:18508/test"

from customer_model import Customer
from redis_om import Migrator

with open('customers.csv') as csv_file:
    customers = csv.DictReader(csv_file)

    for customer in customers:
        cus = Customer(**customer)

        print(f"{customer['firstName']} -> {cus.pk}")
        cus.save()

# Create a RediSearch index
Migrator().run()

I know that we can make data expire in Redis using EXPIRE key seconds. But how do I make a whole model expire?
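For a single key I know I could do something like this with plain redis-py (the key name and TTL here are just placeholders):

import redis

r = redis.Redis()  # assumes a locally reachable Redis instance

# Expire one key after 24 hours (86400 seconds)
r.expire("some:key", 86400)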

Can I somehow overwrite it? I don't think this second option is possible, as each object is linked to a unique pk.
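What I have in mind for overwriting would look roughly like the sketch below, but it only works if the pk field can be set explicitly, which I haven't confirmed (the pk value here is made up):

from customer_model import Customer

# Hypothetical: derive a stable pk from something unique (here, the email)
# so that each day's save() writes over yesterday's hash under the same key.
cus = Customer(
    pk="jane.doe@example.com",   # assumed to be settable explicitly
    first_name="Jane",
    last_name="Doe",
    email="jane.doe@example.com",
    join_date="2022-03-01",
    age=30,
)
cus.save()  # same pk -> same Redis key, so the previous data would be replaced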

Thanks!

1 Answer


If you want to expire a Customer object, you can call the expire method on the underlying redis-py connection for your Customer model, which you can get with Customer.db(). Here's an example:

from redis_om import HashModel, Migrator

# Define a customer model
class Customer(HashModel):
    first_name: str
    last_name: str
    age: int
    bio: str

# Create some customers
cust1 = Customer(
    first_name="Customer",
    last_name="One",
    age=38,
    bio="This is some text about customer 1"
)

cust2 = Customer(
    first_name="Customer",
    last_name="Two",
    age=38,
    bio="This is some text about customer 2"
)

# Persist customers to Redis, setting expiry to 60 seconds
cust1.save()
print(f"Expiring {cust1.key()}")
Customer.db().expire(cust1.key(), 60)
cust2.save()
print(f"Expiring {cust2.key()}")
Customer.db().expire(cust2.key(), 60)

# Create index
Migrator().run()

Showing the expiry set in Redis:

$ redis-cli
127.0.0.1:6379> ttl :__main__.Customer:01FY4VVCW081EHDH6S5W6JY9FD
(integer) 52
127.0.0.1:6379> ttl :__main__.Customer:01FY4VVCW133PPMVXH6HNERKXE
(integer) 46
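
If you'd rather check the remaining TTL from Python instead of redis-cli, the same connection object should work (a quick sketch, assuming the code above has run):

# Remaining TTL in seconds for a saved customer
print(Customer.db().ttl(cust1.key()))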

2 Comments

If you specify a db index other than 0 in the connection URL, Migrator().run() will fail with an error: redis_om.model.migrations.migrator.MigrationError: Creating search indexes is only supported in database 0. You attempted to create an index in database 2
Search functionality isn't supported in databases other than database 0. Using numbered databases and the SELECT command is generally not considered a good idea.
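For example, a connection URL that keeps RediSearch working would point at database 0 (the host, port, and password below are placeholders):

import os

# The trailing /0 selects database 0, which the search indexes require
os.environ["REDIS_OM_URL"] = "redis://default:<password>@<host>:<port>/0"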
