The question is pretty simple. Is there any way to dynamically limit the maximum number of rows of a Postgres database?

I didn't think it was possible until I saw that Heroku does it. After some research, I found a way with triggers:

CREATE OR REPLACE FUNCTION check_number_of_row()
RETURNS TRIGGER AS
$body$
BEGIN
    -- replace 100 with the maximum number of rows you want
    IF (SELECT count(*) FROM your_table) >= 100
    THEN
        RAISE EXCEPTION 'INSERT statement exceeding maximum number of rows for this table';
    END IF;
    -- a BEFORE ROW trigger must return NEW for the INSERT to proceed
    RETURN NEW;
END;
$body$
LANGUAGE plpgsql;

CREATE TRIGGER tr_check_number_of_row 
BEFORE INSERT ON your_table
FOR EACH ROW EXECUTE PROCEDURE check_number_of_row();

But this isn't quite what I need. It limits the rows of a single table to a STATIC maximum, so I would have to loop through all the tables, which looks like a pretty messy option. And I haven't even thought of a way to set this maximum dynamically.

I thought about running a daily/weekly cron job to check all the databases, count the total number of rows, and see if it's within the limit, but if there is a more efficient way, I'll take it.
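
For reference, here is a rough sketch of what such a job could run per database, using the planner's estimates instead of exact counts:

-- Sum the planner's row-count estimates for all ordinary user tables.
-- reltuples is only approximate (it is maintained by VACUUM/ANALYZE),
-- but reading it is far cheaper than running count(*) on every table.
SELECT sum(c.reltuples)::bigint AS estimated_total_rows
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE c.relkind = 'r'
  AND n.nspname NOT IN ('pg_catalog', 'information_schema');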

  • Welcome to Stack Overflow (SO). Looking for some clarification -- I'm not exactly sure what you're asking. Do you want to limit the number of rows stored in the database, or limit the number of records returned from a database query? Commented Nov 8, 2018 at 7:41
  • @GregT. I want to limit the number of rows stored in the entire database, not only in one table, to a dynamic value. Commented Nov 8, 2018 at 11:09

2 Answers


Since SELECT count(*) is a very expensive operation, I don't think it is a good idea to run it on every data change. Besides, the row count won't tell you much about the size of the table.
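
For illustration (a quick ad-hoc query, not part of the solution), you can compare the row estimate with the on-disk size to see how loosely the two are related:

-- Estimated row count vs. total on-disk size for user tables: a wide,
-- heavily indexed table with few rows can outweigh a narrow table with many.
SELECT c.relname,
       c.reltuples::bigint AS estimated_rows,
       pg_total_relation_size(c.oid) AS total_bytes
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE c.relkind = 'r'
  AND n.nspname NOT IN ('pg_catalog', 'information_schema')
ORDER BY total_bytes DESC;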

My approach would be to use a trigger using pg_total_relation_size() like this:

CREATE OR REPLACE FUNCTION size_trig() RETURNS trigger
   LANGUAGE plpgsql AS
$$BEGIN
   -- TG_ARGV[0] is the byte limit passed at trigger creation time
   IF pg_total_relation_size(TG_RELID) > (TG_ARGV[0])::bigint THEN
      -- returning NULL from a BEFORE ROW trigger silently skips the row
      RETURN NULL;
   ELSE
      RETURN NEW;
   END IF;
END;$$;

CREATE TRIGGER size_trig BEFORE INSERT ON big
   FOR EACH ROW
   EXECUTE PROCEDURE size_trig(819200);

That would limit the total table size, including indexes and the TOAST table, to 100 blocks (100 × 8192 bytes = 819200 bytes), and it is comparatively cheap.
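
Since RETURN NULL silently discards the row, you might prefer the INSERT to fail loudly. A variant along these lines (size_trig_err is just an illustrative name) raises an error instead:

CREATE OR REPLACE FUNCTION size_trig_err() RETURNS trigger
   LANGUAGE plpgsql AS
$$BEGIN
   IF pg_total_relation_size(TG_RELID) > (TG_ARGV[0])::bigint THEN
      -- abort the INSERT with an error instead of skipping the row silently
      RAISE EXCEPTION 'table % exceeds its size limit of % bytes',
                      TG_TABLE_NAME, TG_ARGV[0];
   END IF;
   RETURN NEW;
END;$$;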

You still have to define the trigger for all tables that you want checked, but there is no way around that short of modifying PostgreSQL.
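
If creating the trigger on every table by hand is too tedious, a DO block can generate the statements. This is only a sketch; it assumes all tables live in the public schema and share a single limit:

DO $$
DECLARE
   t regclass;
BEGIN
   FOR t IN
      SELECT c.oid::regclass
      FROM pg_class c
      JOIN pg_namespace n ON n.oid = c.relnamespace
      WHERE c.relkind = 'r'
        AND n.nspname = 'public'
   LOOP
      -- attach the size-checking trigger to each table
      EXECUTE format(
         'CREATE TRIGGER size_trig BEFORE INSERT ON %s
             FOR EACH ROW EXECUTE PROCEDURE size_trig(819200)', t);
   END LOOP;
END;$$;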

As you can see, you can set the limit dynamically for each table when you create the trigger. You could also come up with a solution that uses a lookup table for the size limit.
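
Here is a minimal sketch of that lookup-table idea (the names size_limits and size_limit_trig are made up for this example): one row per table, and a missing entry means no limit.

CREATE TABLE size_limits (
    relid     regclass PRIMARY KEY,
    max_bytes bigint NOT NULL
);

CREATE OR REPLACE FUNCTION size_limit_trig() RETURNS trigger
   LANGUAGE plpgsql AS
$$DECLARE
   lim bigint;
BEGIN
   SELECT max_bytes INTO lim FROM size_limits WHERE relid = TG_RELID;
   -- no entry in the lookup table means no limit
   IF lim IS NOT NULL AND pg_total_relation_size(TG_RELID) > lim THEN
      RETURN NULL;
   END IF;
   RETURN NEW;
END;$$;

-- adjust a table's limit at runtime without touching any trigger:
INSERT INTO size_limits VALUES ('big', 819200)
ON CONFLICT (relid) DO UPDATE SET max_bytes = EXCLUDED.max_bytes;

Each table then gets the same trigger with no arguments, and changing a row in size_limits changes the enforced limit immediately.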


I didn't think it was possible until I saw that Heroku does it.

This isn't entirely accurate. Database row limits on Heroku's hobby tier aren't hard-capped. Put another way, when you exceed 10k or 10M rows in hobby-dev or hobby-basic, nothing stops writes immediately. Heroku gives you a 7-day window via email to bring the count under the quota before they revoke write access. If your purpose is simply to get a heads-up and adjust accordingly, they already have a nice system for this.

More to your question, however: considering that Heroku Postgres (especially for hobby databases) is fairly locked down, I don't think you'll have the level of control required to implement what you're after without making it messier than what you already have in your question.
