
I often use bash scripts to add massive amounts of data to the databases of my localhost site. Once I see that the new data is working properly on my local website, I export the database from phpMyAdmin and edit the .sql file. Granted, with Vim it is relatively easy to change all the INSERT statements to INSERT IGNORE and so on, to prepare the file for import through phpMyAdmin in cPanel and finally add the data to my live website. But this becomes cumbersome as the database gets bigger and bigger.
I am new to this and I don't know how to do this operation in a professional/optimal way. Is my entire process wrong? How do you do it?

Thank you for your answers.

  • Hi, your question seems to lack a specific purpose. Are you asking how to copy data from one database to another? Or are you specifically asking how to turn INSERT statements into INSERT IGNORE (search and replace in your favorite text editor, or a stream-editing program like sed, awk, or perl)? Or are you having trouble performing the import because the .sql file is too big? Sorry, I'm not quite following what exactly you need help with. Would MySQL replication work for you in this situation? Commented Apr 19, 2020 at 13:26
  • My question is how experts do this. Do they really just edit the exported .sql file and upload it, with the thousands of repeated inserts simply ignored? I'm sorry, I guess I can't explain it better; feel free to remove the question if it is unclear or too broad. Commented Apr 21, 2020 at 12:53

1 Answer


Ah, I think I understand better. I can't answer for any kind of specific enterprise environment, but I'm sure there are many different systems cobbled together with all sorts of creative baler twine and you could get a wide variety of answers to this question.

I'm actually working on a project right now where we're trying to keep data updated between two systems. The incoming data gets imported to a MySQL database and every now and then, new data is exported to a .sql file. Each row has an auto incrementing primary key "id", so we very simply keep track of the last exported ID and start the export from there (using mysqldump and the --where argument). This is quite simple and doesn't feel like an "enterprise-ready" solution, but it's fine for our needs. That avoids the problem of duplicated inserts.
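As a rough sketch of that ID-tracking export (the database, table, and state-file names here are placeholders, not from my actual setup):

```shell
#!/bin/sh
# Sketch of the "remember the highest exported id" approach.
# "mydb", "mytable", and "last_id.txt" are placeholder names.
STATE_FILE="last_id.txt"

next_export_cmd() {
  # Read the high-water mark from the last run; default to 0 on the first run.
  last_id=$(cat "$STATE_FILE" 2>/dev/null || echo 0)
  # --no-create-info skips CREATE TABLE (the target schema already exists);
  # --insert-ignore makes the dump safe to re-import.
  echo "mysqldump --no-create-info --insert-ignore --where=\"id > $last_id\" mydb mytable"
}
next_export_cmd

# After a successful import on the other side, store the new high-water mark:
#   mysql -N -e "SELECT MAX(id) FROM mydb.mytable" > last_id.txt
```

The `--where` option is what restricts mysqldump to only the rows added since the previous export.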

Another solution would be to export the entire database from your development system, then through some series of actions import it to the production server while deleting the old database entirely. This could depend greatly on the size of your data and how much downtime you're willing to tolerate. An efficient and robust implementation of this would import to a staging database (and verify there were no import problems) before moving the tables to the "correct" database.
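One possible shape for that staging cutover, sketched with placeholder database and table names: MySQL's RENAME TABLE moves every table listed in a single atomic statement, so clients never see a half-swapped state.

```shell
#!/bin/sh
# Sketch: generate the swap statement for a staging-then-production deploy.
# "prod", "staging", and "customer" are placeholder names. Run this only
# after the full dump has been imported into the staging database and checked.
cat > swap.sql <<'EOF'
-- Move the freshly imported staging table into place, keeping the old
-- copy around under a new name for a quick rollback.
RENAME TABLE prod.customer TO prod.customer_old,
             staging.customer TO prod.customer;
EOF
echo "run with: mysql < swap.sql"
```

Keeping the old table around as `customer_old` means a rollback is just another RENAME TABLE away.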

If you are simply referring to schema changes or very small amounts of data, then version control is probably your best bet. This is what I do for some of my database schemas: you start out with the base schema, then every change gets written as a script that can be run incrementally. For instance, in an inventory system I might have originally started with a customer table, with fields for ID and name. Later I added a marketing department, and they want me to collect email addresses, so 2-email.sql would be this single line: ALTER TABLE `customer` ADD `email` VARCHAR(255) NOT NULL AFTER `name`;. Still later, if I decide to handle shipping, I'll need mailing addresses, so 3-address.sql adds those to the database. Then on the other end, I just run those scripts in order (bonus points are awarded for using MySQL logic such as "IF NOT EXISTS" so each script can run as many times as needed without error).
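A minimal runner for that numbered-migration scheme might look like this; the mysql invocation is commented out (with a placeholder database name) so the sketch works without a server:

```shell
#!/bin/sh
# Sketch: apply numbered migration files (1-base.sql, 2-email.sql, ...)
# in numeric order. Here it only prints what it would run; uncomment the
# mysql line and substitute your database name for real use.
apply_migrations() {
  for f in $(ls [0-9]*-*.sql 2>/dev/null | sort -n); do
    echo "applying $f"
    # mysql mydb < "$f"   # real step, "mydb" being a placeholder name
  done
}
apply_migrations
```

The numeric sort matters once you pass migration number 9: a plain alphabetical sort would run 10-something.sql before 2-email.sql.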

Finally, you might benefit from setting up a replication system. Your staging database would automatically send all changes to the production database. Depending on your development process, this can be quite useful or might just get in the way.
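If you go the replication route, the source (staging) server needs binary logging enabled and a unique server ID. A minimal, illustrative my.cnf fragment; the path and database name are placeholders, and the full setup (replication user, connecting the replica) is covered in the MySQL replication docs:

```ini
# On the source server -- illustrative values only
[mysqld]
server-id    = 1
log_bin      = /var/log/mysql/mysql-bin.log
binlog_do_db = mydb
```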


1 Comment

My favorite is the first one, keeping track of the highest ID of the last export, then running the next export starting from that ID. That can all be scripted pretty easily because I don't want to have to do the same work again and again and again :) Cheers
