I am using JSON in my SQLAlchemy DB model:
from sqlalchemy.dialects.postgresql import JSON

class Customer(db.Model):
    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    custcontext_json = db.Column(JSON, default=lambda: {})
Now I have this query that is getting too slow:
from sqlalchemy import or_, func, Unicode, DateTime
import pendulum

customers = Customer.query.filter(
    Customer.clientid == clientid,
    or_(
        func.lower(Customer.custcontext_json['cinfo', 'userName'].astext.cast(Unicode)).contains(searchterm.lower()),
        func.lower(Customer.custcontext_json['cinfo', 'home', 'sign'].astext.cast(Unicode)).contains(searchterm.lower())
    ),
    or_(
        Customer.custcontext_json['cinfo', 'home', 'status'].astext == 'pre',
        Customer.custcontext_json['cinfo', 'home', 'status'].astext == 'during',
        Customer.custcontext_json['cinfo', 'lastChange'].astext.cast(DateTime) > pendulum.now('UTC').subtract(days=14)
    )
).all()
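In case it matters, the SQL for the JSON filters can be inspected by compiling the statement against the PostgreSQL dialect (sketch below; clientid and the or_() conditions are the same ones shown above):

from sqlalchemy.dialects import postgresql

# Same filters as above, kept as a query object instead of calling .all(),
# so the generated SQL (and the JSON path expressions in it) can be printed.
query = Customer.query.filter(
    Customer.clientid == clientid,
    # ... the same two or_() blocks as in the query above ...
)
print(query.statement.compile(dialect=postgresql.dialect()))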
Is it possible to index the table for this query, or at least for parts of it?
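For the equality checks on status I was thinking of a plain expression index, and for the contains() searches maybe a trigram GIN index via pg_trgm. Something roughly like this (untested sketch: the index names are made up, I'm assuming the default customer table name, and I'm not sure the indexed expressions match what SQLAlchemy actually emits for the filters above):

from sqlalchemy import Index, DDL, event

# Expression index on the extracted status text, aimed at the equality filters.
status_index = Index(
    'ix_customer_status',  # hypothetical name
    Customer.custcontext_json['cinfo', 'home', 'status'].astext,
)

# Trigram GIN index on the lowered userName text, aimed at the contains() search.
# Requires the pg_trgm extension (CREATE EXTENSION pg_trgm).
username_trgm = DDL(
    "CREATE INDEX ix_customer_username_trgm ON customer "
    "USING gin (lower(custcontext_json #>> '{cinfo,userName}') gin_trgm_ops)"
)
event.listen(Customer.__table__, 'after_create', username_trgm)

Pointers on which of these (if any) PostgreSQL would actually use for the query above would be much appreciated.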