I am building a web-based ERP application for the retail industry using PHP and MySQL. There will be a local database in each store and one on a central server, all with the same structure. The plan is to run the app on localhost in each store and, at the end of the day, update the server database from the local databases in the different stores.

Remember, I would like to update the database on the server in the same sequence in which the queries were run on the different local databases.

Can anyone please help me with this?

Thank you.

  • You might want to consider setting up the remote db as a slave. Commented Mar 13, 2012 at 7:50
  • Related process is called ETL (from data warehousing) en.wikipedia.org/wiki/Extract,_transform,_load Commented Mar 13, 2012 at 7:52

2 Answers


Perhaps link to your main database directly from the localhost sites to begin with? Then there is no need to update at the end of the day: every change is simply made to the central database with no "middle man", so to speak. If you need the local databases to stay separate, run each query on both at once.

Note: I'm unfamiliar with how an ERP application works, so forgive me if I'm way off base here.
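
In PHP terms, the "run each query on both at once" idea might look like the sketch below, using mysqli. The host names, credentials, and the sales table are placeholders, not values from the question:

    <?php
    // Dual-write sketch: apply each write to the local and central
    // databases together. Hosts and credentials are hypothetical.
    $local  = new mysqli('localhost', 'erp_user', 'secret', 'erp_store');
    $remote = new mysqli('central.example.com', 'erp_user', 'secret', 'erp_central');

    function dualWrite(mysqli $local, mysqli $remote, string $sql): void
    {
        // Write locally first so the store keeps working even if the
        // central server is briefly unreachable.
        $local->query($sql);
        if (!$remote->query($sql)) {
            // A real system would queue the statement for retry here.
            error_log('Central write failed: ' . $remote->error);
        }
    }

    dualWrite($local, $remote,
        "INSERT INTO sales (product_id, qty, sold_at) VALUES (42, 1, NOW())");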


2 Comments

Thanks for your answer Douglas. Retail ERP applications also include real-time operations like sales transactions. Don't you think speed will be an issue if I run queries on the local databases as well as the server at the same time?
I guess that depends on your server's capabilities, the number of entries going in at a time, and the internet connection. I'm no expert on the issue, but I'd imagine simple text updates could be run hundreds of times at once with little to no issue. If you're uploading MB/GB of data, then there's more likely to be an issue. Personally, I'd set it up and see how well it runs, but I also have only time and not much money to spend on testing/research :)

You may have to log every INSERT/UPDATE/DELETE SQL request in a daily file, with a timestamp for each request, on the local databases.

Example:

 2012-03-13 09:15:00 INSERT INTO...
 2012-03-13 09:15:02 UPDATE MYTABLE SET...
 2012-03-13 09:15:02 DELETE FROM...
 ...
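
On the local side, a minimal PHP wrapper could produce such a log. This is only a sketch; the log path is an assumption:

    <?php
    // Run a statement on the local database and append it, with a
    // timestamp, to a daily log file (path is illustrative).
    function logAndRun(mysqli $db, string $sql): void
    {
        $db->query($sql);
        $line = date('Y-m-d H:i:s') . ' ' . $sql . PHP_EOL;
        // One file per day, e.g. /var/log/erp/2012-03-13.sql
        file_put_contents('/var/log/erp/' . date('Y-m-d') . '.sql',
                          $line, FILE_APPEND | LOCK_EX);
    }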

Then send the log files to the main server daily, merge them all, sort the merged file to keep the execution order, and read it back to execute each request on the main database.
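
A merge-and-replay step on the main server might look like this sketch; the directory layout and glob pattern are assumptions. Note that two statements logged in the same second keep no guaranteed order here, so a real log would want an extra sequence number:

    <?php
    // Merge the daily logs collected from every store, sort them by
    // their leading timestamp, and replay them on the central database.
    $lines = [];
    foreach (glob('/var/log/erp/incoming/*-2012-03-13.sql') as $file) {
        $lines = array_merge(
            $lines,
            file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
        );
    }
    sort($lines); // the 'YYYY-mm-dd HH:ii:ss' prefix makes a plain string sort chronological

    $central = new mysqli('localhost', 'erp_user', 'secret', 'erp_central');
    foreach ($lines as $line) {
        $sql = substr($line, 20); // strip the 19-character timestamp and the space
        $central->query($sql);
    }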

However, it's an unusual way to do things in an ERP application. Product stock information, for instance, can't simply be merged: it is information shared by all the stores, so be careful with this kind of data.

You also can't use auto-increment keys with this process: they will cause duplicate-key errors on some inserts, or updates applied to the wrong records.
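
One common workaround, sketched below, is to make a per-store identifier part of the key instead of relying on a single auto-increment column. The table, column names, and STORE_ID constant are hypothetical:

    <?php
    // Composite-key sketch: each store owns its own id range, so rows
    // from different stores can never collide on the central database.
    //
    //   CREATE TABLE sales (
    //       store_id   INT NOT NULL,
    //       sale_id    INT NOT NULL,
    //       product_id INT NOT NULL,
    //       qty        INT NOT NULL,
    //       PRIMARY KEY (store_id, sale_id)
    //   );
    const STORE_ID = 3; // set differently in each store's local config

    $db = new mysqli('localhost', 'erp_user', 'secret', 'erp_store');
    // MAX()+1 is not safe under concurrency; a real system would wrap
    // it in a transaction with a lock, or keep a counter table.
    $next = $db->query(
        'SELECT COALESCE(MAX(sale_id), 0) + 1 FROM sales WHERE store_id = ' . STORE_ID
    )->fetch_row()[0];
    $db->query('INSERT INTO sales (store_id, sale_id, product_id, qty) VALUES ('
        . STORE_ID . ", $next, 42, 1)");

MySQL's auto_increment_increment and auto_increment_offset server variables can achieve a similar effect (interleaved id ranges per store) without changing the schema.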

3 Comments

Thanks befox. What I had in mind is that with a central database, the owner of the stores will not have to log in to the different localhosts individually; instead he can access the central database, which will give him the overall information from the different stores. But the problems you have indicated are also a concern, like the duplicate keys and the product stock. Will partitioning the central database solve the issue?
If you can have a different database for each store, you can use MySQL's replication system... it will keep each store's database on the central server up to date. See dev.mysql.com/doc/refman/5.1/en/replication-solutions.html
I have looked into that option as well befox, but the thing is MySQL won't support multi-master replication. If I have multiple local databases, it means I have multiple masters from which I want to replicate data onto the single slave database on the server. I don't think that is possible.
