I currently have 1,100,000 rows in the table, and it will grow over time. I am running this Postgres query on my database server, and it takes approximately 5 seconds to execute. How can I optimize it to make it run faster?
Query:
select sum(cast("total_value" as float)) as "total_value", sum(cast("fob_value" as float)) as "total_fob_value"
from export
where ("total_value" != 'N/A' and "total_value" != 'N?A') and
("fob_value" != 'N/A' and "fob_value" != 'N?A') and
"product_desc" ilike '%pen%' and
("shipping_date" between '2020-07-31T13:00:00.000Z' and '2020-08-28T09:58:04.451Z');
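Two things in this query tend to defeat indexes: the leading-wildcard `ilike '%pen%'` (a plain btree index cannot serve it) and the per-row `cast(... as float)`. As a sketch of possible index support, assuming the table and column names above and that the `pg_trgm` extension can be installed, a GIN trigram index lets the `ILIKE` predicate use an index, and a btree index covers the date range:

```sql
-- Assumption: pg_trgm is available on this server.
-- A GIN trigram index lets ILIKE '%pen%' avoid a full sequential scan.
CREATE EXTENSION IF NOT EXISTS pg_trgm;

CREATE INDEX IF NOT EXISTS export_product_desc_trgm_idx
    ON export USING gin (product_desc gin_trgm_ops);

-- A plain btree index for the shipping_date BETWEEN predicate.
CREATE INDEX IF NOT EXISTS export_shipping_date_idx
    ON export (shipping_date);
```

Running `EXPLAIN ANALYZE` on the query before and after creating these indexes will show whether the planner actually uses them; which index wins depends on how selective each predicate is on your data.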
The 'N/A' and 'N?A' stuff is nonsense; that's what nullable columns are for. Don't defer fixing the data into a proper format to the point of querying; do it before insertion. Unless you have some really good reason not to...?
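To sketch that advice concretely, assuming `total_value` and `fob_value` are currently text columns, a one-time migration could replace the sentinel strings with NULL and store the values as numeric, so the query no longer needs the `!=` filters or the casts:

```sql
-- Hypothetical one-time cleanup: sentinel strings become NULL.
UPDATE export SET total_value = NULL WHERE total_value IN ('N/A', 'N?A');
UPDATE export SET fob_value   = NULL WHERE fob_value   IN ('N/A', 'N?A');

-- Store the values as numbers; no cast needed at query time.
-- This will fail if any other non-numeric strings remain, which is
-- a useful check that the data really is clean.
ALTER TABLE export
    ALTER COLUMN total_value TYPE numeric USING total_value::numeric,
    ALTER COLUMN fob_value   TYPE numeric USING fob_value::numeric;
```

After this, `sum("total_value")` works directly, since SQL aggregates skip NULLs automatically, and the WHERE clause shrinks to just the `product_desc` and `shipping_date` predicates.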