I am using redis-om to make it easier to run aggregations with the RediSearch module. The thing is that I want to upload some data to Redis every day without keeping the data uploaded the day before; that is to say, I want to either make the old data expire or overwrite it with the new data.
I am following the example from the Redis documentation, using Redis Cloud. This is my model:
    import datetime
    from typing import Optional

    from pydantic import EmailStr

    from redis_om import HashModel


    class Customer(HashModel):
        first_name: str
        last_name: str
        email: EmailStr
        join_date: datetime.date
        age: int
        bio: Optional[str]
This is how I upload the data:
    import csv
    import os

    os.environ["REDIS_OM_URL"] = "redis://default:TbmcFFUUjPiOakJA5RcZKV1DBNRRFV9L@redis-18508.c228.us-central1-1.gce.cloud.redislabs.com:18508/test"

    from customer_model import Customer
    from redis_om import Migrator

    with open('customers.csv') as csv_file:
        customers = csv.DictReader(csv_file)
        for customer in customers:
            cus = Customer(**customer)
            print(f"{customer['first_name']} -> {cus.pk}")
            cus.save()

    # Create a RediSearch index
    Migrator().run()
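One idea I had is to wipe the previous day's records just before the new upload. Here is a minimal sketch of that; the key pattern is an assumption on my part, based on redis-om's default naming of hash keys as <module>.<ClassName>:<pk>:

    from customer_model import Customer

    # Assumption: redis-om names each hash "<module>.<ClassName>:<pk>",
    # so scanning for that prefix should find yesterday's records.
    r = Customer.db()
    for key in r.scan_iter(match="customer_model.Customer:*"):
        r.delete(key)

But depending on an internal key format seems fragile, so expiry feels cleaner.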
I know that we can make data expire in Redis using EXPIRE key seconds. But how do I make a whole model expire?
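Something along these lines is what I have in mind. This is just a sketch, assuming (from reading the redis-om source) that Customer.db() returns the underlying redis-py client and that cus.key() is the full Redis key of the saved hash:

    from customer_model import Customer

    ONE_DAY_SECONDS = 24 * 60 * 60

    cus = Customer(
        first_name="Andrew",
        last_name="Brookins",
        email="andrew.brookins@example.com",
        join_date="2023-01-02",
        age=38,
    )
    cus.save()

    # Assumption: db() is the redis-py client and key() is the hash's
    # Redis key, so this amounts to a plain EXPIRE on the stored hash.
    Customer.db().expire(cus.key(), ONE_DAY_SECONDS)

I have also seen an expire() method on model instances in newer redis-om versions, but I am not sure whether that is the intended approach, or whether it has to be called after every save().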
Can I somehow overwrite it instead? I don't think this second option is possible, since each object is linked to a pk that is unique.
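For completeness, the overwrite idea I was considering looks like this: a sketch assuming that pk is just a model field that redis-om only auto-generates when omitted, so deriving it from a stable column (I use email here as a hypothetical natural key) would make each day's save() write to the same Redis key:

    from customer_model import Customer

    def save_customer(row: dict) -> Customer:
        # Assumption: passing pk explicitly skips the auto-generated ULID,
        # so tomorrow's upload overwrites the same hash instead of adding
        # a new one. Using 'email' as the natural key is my own choice.
        cus = Customer(pk=row["email"], **row)
        cus.save()
        return cus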
Thanks!