I created a new Postgres database and began filling the tables with data. The tables have various relationships between one another. I am using a Node.js app with Sequelize to handle the code-first migrations.
My problem is that I am not sure how I should handle the data itself in a code-first approach.
Each migration script contains something simple like this:
up - creates a table, down - drops the table (and all the data inside of it)
What I am wondering is: how should I handle all the data that gets added to the different tables over time?
At present, if I ever decide to run 'db:migrate:undo' on any of the scripts, it will drop the table and all the data inside it.
I would also like full support for data restoration, so that if I ever need to go as far as running 'db:migrate:undo:all', then once I run 'db:migrate' again all the previous data is restored.
Is there a nice approach to achieving that, or is it perhaps bad practice?
Edit: here is a sample migration file as it stands at present:
module.exports = {
  up: (queryInterface, Sequelize) =>
    queryInterface.createTable('LookupType', {
      id: {
        allowNull: false,
        primaryKey: true,
        type: Sequelize.UUID,
      },
      name: {
        type: Sequelize.STRING(32),
        allowNull: false,
      },
      createdAt: {
        allowNull: false,
        type: Sequelize.DATE,
      },
      updatedAt: {
        allowNull: false,
        type: Sequelize.DATE,
      },
    },
    {
      schema: 'lov',
    }),

  down: (queryInterface /* , Sequelize */) =>
    queryInterface.dropTable({
      tableName: 'LookupType',
      schema: 'lov',
    }),
};
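The only idea I have come up with so far is to also insert the lookup rows from a separate migration, roughly like the untested sketch below (the id and name values are made up). But that only covers fixed lookup values; it would not restore data that the application adds over time, which is really what I am asking about.

// rough, untested sketch of a "data" migration for the lookup values
module.exports = {
  up: (queryInterface /* , Sequelize */) =>
    queryInterface.bulkInsert(
      { tableName: 'LookupType', schema: 'lov' },
      [
        // made-up rows, just to illustrate the idea
        {
          id: '00000000-0000-0000-0000-000000000001',
          name: 'Active',
          createdAt: new Date(),
          updatedAt: new Date(),
        },
        {
          id: '00000000-0000-0000-0000-000000000002',
          name: 'Inactive',
          createdAt: new Date(),
          updatedAt: new Date(),
        },
      ]),

  down: (queryInterface /* , Sequelize */) =>
    // remove the seeded rows again (a null where clause deletes all rows)
    queryInterface.bulkDelete({ tableName: 'LookupType', schema: 'lov' }, null),
};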