
I am looking to perform some database operations programmatically inside a Django application. Specifically, I want to:

  • Copy Table A in Database A over to Table A (table name preserved) in Database B, then clean up by removing Table A from Database A.

Therefore, I have a number of possible options to try:

  • I could call "pg_dump" from the application, using some sort of system call.
  • I could use the psycopg package directly.
  • My preference would be to use Django's built-in with connection.cursor() as cursor.

Assumptions:

  • "Table A" exists in Database A but "Table A" (both schema and data) does not exist in Database B.
  • Database A and Database B exist on different "HOSTs"

What are some potential methods to achieve this? I need something like the following, but talking between the two databases, source and target:

CREATE TABLE [Table to copy To]
AS TABLE [Table to copy From]
WITH NO DATA;
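For the preferred connection.cursor() route, here is a minimal sketch, assuming both databases are configured in Django's DATABASES setting under hypothetical "source" and "target" aliases:

```python
# Sketch only: copies a table between two Django-configured Postgres
# databases using plain cursors. The "source"/"target" aliases are
# assumptions, not standard names.

def create_sql(table, columns):
    # columns is a list of (name, data_type) pairs from the source schema
    cols = ", ".join(f"{name} {dtype}" for name, dtype in columns)
    return f"CREATE TABLE {table} ({cols})"

def insert_sql(table, columns):
    names = ", ".join(name for name, _ in columns)
    holders = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({names}) VALUES ({holders})"

def copy_table(table, source_alias="source", target_alias="target",
               chunk_size=1000):
    # Deferred import so the SQL helpers above are usable without Django.
    from django.db import connections

    with connections[source_alias].cursor() as src, \
         connections[target_alias].cursor() as dst:
        # Recover the column layout from the source's information_schema.
        src.execute(
            "SELECT column_name, data_type FROM information_schema.columns "
            "WHERE table_name = %s ORDER BY ordinal_position", [table])
        columns = src.fetchall()
        dst.execute(create_sql(table, columns))
        src.execute(f"SELECT * FROM {table}")
        while rows := src.fetchmany(chunk_size):
            dst.executemany(insert_sql(table, columns), rows)
```

Note this rebuilds only column names and base types; constraints, indexes, and sequences would need separate handling.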
  • The following thread may help you: stackoverflow.com/questions/3195125/… Commented Jul 17, 2020 at 15:02
  • @prvreddy You can't do that programmatically, unfortunately; user input is needed for the password. Commented Jul 17, 2020 at 15:08
  • @MichealJ.Roberts Do you need to copy the table structure only, or the structure and data? Commented Jul 17, 2020 at 15:16
  • @DanilaGanchar Structure and data. Commented Jul 17, 2020 at 15:16

1 Answer


1) You can do it using dblink:

# connect to pg
# psql -U user_here etc...
-- create a few db
create database first;
create database second;

-- connect to first db and create a table with a few records
\c first;

create table users
(
    id serial not null
        constraint users_pk
            primary key,
    name varchar(20) not null
);


INSERT INTO public.users (id, name) VALUES (1, 'first');
INSERT INTO public.users (id, name) VALUES (2, 'sec');
INSERT INTO public.users (id, name) VALUES (3, 'one_more');
INSERT INTO public.users (id, name) VALUES (4, 'etc');

-- connect to second db and copy table with data
\c second;
-- dblink -- executes a query in a remote database 
create extension dblink;

-- set your creds...
CREATE TABLE users AS SELECT * FROM dblink('dbname=first user=root password=root', 'select id, name from users') as tbl(id int, name varchar(20));

-- check data:
SELECT * FROM users;
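The same dblink statement can be issued from Django against the target database's connection. A rough sketch, where the connection string, host, and table definition are illustrative only:

```python
# Sketch: run the dblink copy from Django on the *target* database.
# Credentials and host below are placeholders; dblink executes the
# inner query on the remote (source) server.
DBLINK_COPY_SQL = """
CREATE TABLE users AS
SELECT * FROM dblink(
    'host=source-host dbname=first user=root password=root',
    'SELECT id, name FROM users'
) AS tbl(id int, name varchar(20));
"""

def copy_via_dblink(target_alias="target"):
    from django.db import connections  # deferred: requires Django settings

    with connections[target_alias].cursor() as cursor:
        cursor.execute("CREATE EXTENSION IF NOT EXISTS dblink;")
        cursor.execute(DBLINK_COPY_SQL)
```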

2) You can do it using pg_dump:

# generate dump with data of users table from first db
pg_dump -U root -d first --table=users --inserts > /tmp/users.dump
# run dump script on second db
psql -U root -d second < /tmp/users.dump;
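This pg_dump route also covers the "system call" option from the question: it can be driven from Python with subprocess, and setting PGPASSWORD in the environment avoids the interactive password prompt raised in the comments. Database names, user, and password below are placeholders:

```python
# Sketch: shell out to pg_dump/psql (both must be on PATH).
import os
import subprocess

def pg_dump_args(table, db, user):
    # Same flags as the shell example above.
    return ["pg_dump", "-U", user, "-d", db, f"--table={table}", "--inserts"]

def dump_and_restore(table="users", src_db="first", dst_db="second",
                     user="root", password="root"):
    env = {**os.environ, "PGPASSWORD": password}  # no interactive prompt
    dump = subprocess.run(pg_dump_args(table, src_db, user),
                          env=env, check=True, capture_output=True, text=True)
    # Pipe the dump straight into psql on the second database.
    subprocess.run(["psql", "-U", user, "-d", dst_db],
                   env=env, input=dump.stdout, check=True, text=True)
```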

3) You can do it using pandas:

import pandas as pd
from sqlalchemy import create_engine

# Note: the dialect name is "postgresql+psycopg2";
# the old "postgres://" prefix was removed in SQLAlchemy 1.4.
source = create_engine('postgresql+psycopg2://root:root@localhost:5432/first', echo=True)
target = create_engine('postgresql+psycopg2://root:root@localhost:5432/second', echo=True)

# Stream the source table in chunks so large tables need not
# fit in memory, appending each chunk to the target.
for df in pd.read_sql('SELECT * FROM users', con=source, chunksize=1000):
    df.to_sql(con=target, name='users', index=False, if_exists='append')
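Whichever method is used, the clean-up half of the question (removing Table A from Database A after the copy) can be sketched as a row-count check followed by a DROP; the "source"/"target" aliases are hypothetical Django DATABASES entries:

```python
# Sketch: drop the source table only once the copy is verified.
def count_sql(table):
    return f"SELECT count(*) FROM {table}"

def drop_source_if_copied(table="users", source_alias="source",
                          target_alias="target"):
    from django.db import connections  # deferred: requires Django settings

    with connections[source_alias].cursor() as src, \
         connections[target_alias].cursor() as dst:
        src.execute(count_sql(table))
        dst.execute(count_sql(table))
        # Only remove the source table if the copy looks complete.
        if src.fetchone()[0] == dst.fetchone()[0]:
            src.execute(f"DROP TABLE {table}")
```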