I work with an ancient inventory control application on Linux. It can only print to text files to communicate with other software. When I need to update a table in our PostgreSQL database, I have been using psql -f text_file (where text_file is the file of SQL commands generated by our old application). This works fine but is rather slow, as the psql session terminates after completing each text file. I was wondering if there is a method of invoking a psql session as a background process that waits for input and does not terminate.
2 Answers
If you want to run a bunch of SQL files against your database, one after the other, I would start psql as an interactive console:
psql mydatabase
And execute one file after the other with the \i command:
\i text_file
If you want to script / automate things, a fifo might be what you need. I wrote more about that here.
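To give a rough idea of the fifo approach, here is a minimal sketch assuming bash; /tmp/psql_pipe, mydatabase, and the file names are placeholders:

mkfifo /tmp/psql_pipe
psql mydatabase < /tmp/psql_pipe &   # one psql session reads from the pipe
exec 3> /tmp/psql_pipe               # keep a writer open so psql never sees EOF
echo '\i text_file' >&3              # feed files into the same session
echo '\i next_file' >&3
exec 3>&-                            # closing the fd ends the session
wait

The session stays open for as long as file descriptor 3 does, so you can keep feeding it \i commands.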
Or you may be interested in a coprocess like Craig Ringer describes here.
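A bash coprocess version might look roughly like this (again only a sketch; mydatabase and the file names are stand-ins):

coproc PSQL { psql mydatabase; }
echo '\i text_file' >&"${PSQL[1]}"   # write commands to psql's stdin
echo '\i next_file' >&"${PSQL[1]}"
eval "exec ${PSQL[1]}>&-"            # close its stdin so psql finishes and exits
cat <&"${PSQL[0]}"                   # collect whatever output it produced
wait "$PSQL_PID"

One caveat: with a lot of output you would want to read from ${PSQL[0]} as you go, since psql can block once the pipe buffer fills.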
3 Comments
psql -f -?
I'm not exactly sure what you're wanting. If you just have a bunch of files that need processing, use:
cat *.sql | psql -f -
If you are just wanting to continually run psql and have it run any changes in a file, something like this may work:
(
while sleep 5; do
    echo '\i sql_file.sql'        # tell psql to run whatever is in the file
    cat /dev/null > sql_file.sql  # then empty it for the next pass
done
) | psql --file -
I'm not sure how advisable this is; I've never done it myself.
2 Comments
The cat route should be the simplest solution if you just want to execute a bunch of files in no particular order and do not want to check the results in between.