
I have some Perl scripts on a Unix-based server that read a common text file containing server IPs and login credentials, which the scripts use to log in and perform routine operations on those servers. Currently, the scripts are run manually at different times.

I would like to know whether, if I cron these scripts to execute at the same time, it will cause any issues with accessing data from the text file (file locking?), since all the scripts will essentially be accessing the data file at the same time.

Also, is there a better way to do this (without using a DB, since I can't due to some server restrictions)?

  • There's no risk in reading the same file simultaneously. The risk is in reading something that is being written to, or in writing simultaneously. See perldoc -f flock: you probably want each reading instance to obtain a LOCK_SH, and any writer to obtain a LOCK_EX (see the sketch after these comments). Commented Jan 28, 2013 at 6:13
  • It sounds like you want an SQLite database. No server restriction I can think of would stop you from using SQLite, as it doesn't require a server process (a sketch of this follows below as well). Commented Jan 28, 2013 at 6:16
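
As a rough illustration of the locking pattern the first comment describes, here is a minimal Perl sketch; the file path and record contents are made-up placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    # Hypothetical path; substitute the real credentials file.
    my $file = '/path/to/servers.txt';

    # Reader: a shared lock lets any number of readers in at once,
    # but blocks while a writer holds an exclusive lock.
    open my $fh, '<', $file or die "Cannot open $file: $!";
    flock $fh, LOCK_SH      or die "Cannot lock $file: $!";
    my @records = <$fh>;
    close $fh;    # closing the handle releases the lock

    # Writer: an exclusive lock keeps readers from seeing a
    # half-written file. Open read-write and truncate *after*
    # locking, so the data is never destroyed before the lock
    # is actually held.
    open my $wfh, '+<', $file or die "Cannot open $file: $!";
    flock $wfh, LOCK_EX       or die "Cannot lock $file: $!";
    truncate $wfh, 0;
    seek $wfh, 0, 0;
    print {$wfh} @records;
    close $wfh;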
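
And a sketch of the SQLite suggestion from the second comment. DBD::SQLite is an ordinary CPAN module and needs no server process; the table name and columns here are invented for the example:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # The database is just a local file; no server process involved.
    my $dbh = DBI->connect('dbi:SQLite:dbname=servers.db', '', '',
                           { RaiseError => 1, AutoCommit => 1 });

    # One-time setup: mirror the text file's contents in a table.
    $dbh->do('CREATE TABLE IF NOT EXISTS servers'
           . ' (ip TEXT, user TEXT, pass TEXT)');

    # Concurrent readers are safe; SQLite handles locking internally.
    my $rows = $dbh->selectall_arrayref(
        'SELECT ip, user, pass FROM servers');
    for my $row (@$rows) {
        my ($ip, $user, $pass) = @$row;
        # ... log in to $ip and perform the routine operations ...
    }
    $dbh->disconnect;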

1 Answer


It depends on what kind of access.

There is no problem with reading the data file from multiple processes. If you want to update the data file while it might be read, it's better to do it atomically (e.g. write a new version under a different name, then rename it).
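
A minimal sketch of that write-then-rename pattern, assuming a hypothetical /path/to/servers.txt. The temp file must live in the same directory (or at least on the same filesystem), because rename() is only atomic within a single filesystem:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Basename qw(dirname);
    use File::Temp qw(tempfile);

    # Hypothetical path; readers keep opening this name directly.
    my $file = '/path/to/servers.txt';

    # Write the new version to a temp file in the same directory.
    my ($tmp_fh, $tmp_name) =
        tempfile('servers.XXXXXX', DIR => dirname($file));
    print {$tmp_fh} "192.0.2.10 user password\n";   # new contents
    close $tmp_fh or die "Cannot close $tmp_name: $!";

    # Atomic replacement: readers see either the old file or the
    # new one, never a partially written mix of the two.
    rename $tmp_name, $file or die "Cannot rename $tmp_name: $!";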


2 Comments

Ok great, that's what I needed to know. I will only be reading the file and not updating it. Thanks a lot!
Your update solution only works if there's only one updater at any time. Otherwise, you have a race condition.
