
I would like to load data from a CSV file into a PostgreSQL database running in Docker. I run:

docker exec -ti project_db_1 psql -U postgres

Then I select my database:

\c myDatabase

Now I try to load data from myfile.csv, which is in the main directory of the Django project, into the backend_data table:

\copy backend_data (t, sth1, sth2) FROM 'myfile.csv' CSV HEADER;

However, I get an error:

myfile.csv: No such file or directory

It seems to me that I have tried every possible path and nothing works. Any ideas how I can solve this? This is my docker-compose.yml:

version: '3'

services:
  db:
    image: postgres
    environment:
      POSTGRES_USER: myuser
      POSTGRES_PASSWORD: mypassword
  django:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
  • Which container is myfile.csv in, django or db? And which container are you exec'ing into? Commented Oct 20, 2017 at 13:05
  • @Alasdair myfile.csv is in the main directory, alongside the docker-compose.yml and the Dockerfile of the Django project. I execute \copy backend_data (t, sth1, sth2) FROM 'myfile.csv' CSV HEADER; in project_db_1. Commented Oct 20, 2017 at 13:07
  • You did not mount any volume for the db container, so the file myfile.csv is not in the db container, yet you are running the command in that container. Possible solution: in docker-compose.yml, add volumes: - "<path_to_csv_in_local>:<path_to_csv_in_db_container>" to the db service. Commented Oct 20, 2017 at 13:11

2 Answers


The easiest way is to mount a directory into the postgres container, place the file into the mounted directory, and reference it there.

We actually mount the pgdata directory, so that the postgres data survives even if we recreate the postgres container. My example will therefore also use pgdata (note the PGDATA environment variable, which tells the image to keep its data inside the mounted subdirectory):

services:
  db:
    image: postgres
    environment:
      POSTGRES_USER: myuser
      POSTGRES_PASSWORD: mypassword
      # without this, postgres keeps its data in /var/lib/postgresql/data
      # and the mount below would not actually persist it
      PGDATA: /var/lib/postgresql/data/pgdata
    volumes:
      - "<path_to_local_pgdata>:/var/lib/postgresql/data/pgdata"

Place myfile.csv into <path_to_local_pgdata> (either an absolute path or a path relative to the directory containing docker-compose.yml). The copy command then looks like this:

\copy backend_data (t, sth1, sth2) FROM '/var/lib/postgresql/data/pgdata/myfile.csv' CSV HEADER;
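Putting it together, a minimal end-to-end sketch. It assumes the compose project is named project (so the container is project_db_1) and that <path_to_local_pgdata> is ./local_pgdata; adjust both to your setup:

# recreate the db container so the new volume mount takes effect
docker-compose up -d db

# put the CSV into the mounted directory on the host
cp myfile.csv ./local_pgdata/

# run the client-side copy from inside the db container
docker exec -ti project_db_1 psql -U postgres -d myDatabase \
  -c "\copy backend_data (t, sth1, sth2) FROM '/var/lib/postgresql/data/pgdata/myfile.csv' CSV HEADER"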

4 Comments

And local_pgdata has to be in the main directory?
If it is a relative path, then local_pgdata has to be relative to the main directory, yes.
So in docker-compose.yml it should be - "./local_pgdata:/var/lib/postgresql/data/pgdata". Thank you for your help.
What if my CSV file is in another container? How can I place it into the db volume path?
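As for that last comment (an editorial note, not from the original thread): docker cp can copy a file out of any running container into the host directory that is mounted into db. Assuming the file lives at /code/myfile.csv inside a container named project_django_1:

docker cp project_django_1:/code/myfile.csv ./local_pgdata/

(In the compose file above, the django service mounts .:/code, so in that particular setup the file already exists on the host and a plain cp works as well.)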

You need to mount the path of myfile.csv in the db container if you are running the command in that container.

You might have mounted the file only in the django service.

A possible docker-compose.yml:

version: '3'

services:
  db:
    image: postgres
    environment:
      POSTGRES_USER: myuser
      POSTGRES_PASSWORD: mypassword
    volumes:
      - <path_to_csv_in_local>:<path_of_csv_in_db_container>
  django:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
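With that mount in place, the copy command inside the db container references the container-side path (both placeholders are yours to fill in):

\copy backend_data (t, sth1, sth2) FROM '<path_of_csv_in_db_container>' CSV HEADER;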

You also haven't created a mount for your db data. This will be fatal once you remove your database container (you will lose all your data). The postgres container stores its data in /var/lib/postgresql/data, so you need to mount that path to your local system to keep the data even when the container goes away.

volumes:
  - <path_of_db_in_local_system>:/var/lib/postgresql/data
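As an aside (again not from the original thread): for a one-off import you can skip mounting entirely and stream the CSV through the psql client's stdin from the host. A sketch, assuming the container is named project_db_1:

cat myfile.csv | docker exec -i project_db_1 \
  psql -U postgres -d myDatabase \
  -c "\copy backend_data (t, sth1, sth2) FROM STDIN CSV HEADER"

Note -i without -t: there is no TTY when piping, but stdin must stay open for docker exec to forward the data.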

8 Comments

How can I check what my path_of_db_in_local_system is?
You can set it to anything you wish (as long as it is accessible by the docker service). It is the path on your local system to which any data stored in the container (basically any model-based operation the Django application performs on postgres) is copied, so that you keep your database intact even if the container goes down due to unforeseen circumstances.
Okay, so should I have two volumes, <path_to_csv_in_local>:<path_of_csv_in_db_container> and <path_of_db_in_local_system>:/var/lib/postgresql/data, or are they the same?
Not necessary. If you are mounting the data directory path (which should always be mounted to avoid losing data), simply copy the CSV into that mounted directory and it will be available in the container. Suppose you use volumes: - /var/lib/postgres/:/var/lib/postgresql/data/pgdata; then simply copy your CSV file into /var/lib/postgres/ on the host and it will be available at /var/lib/postgresql/data/pgdata in the container.
ERROR: for django Cannot start service django: Mounts denied: The path /var/lib/postgres is not shared from OS X and is not known to Docker.