
I'm new to Docker, Redis, and networking in general (I know Python at least!). First, I have figured out how to get a Redis Docker image and run it in a container:

docker run --name some-redis -d redis

As I understand it, this Redis instance has port 6379 available for other containers to connect to.

docker network inspect bridge

   "Containers": {
        "2ecceba2756abf20d5396078fd9b2ecf0d60ab04ca6b8df5e1b631b6fb5e9a85": {
            "Name": "some-redis",
            "EndpointID": "09f0069dae3632a2456cb4d82ad5e7c9782a2b58cb7a4ee655f57b5c410c3e87",
            "MacAddress": "02:42:ac:11:00:02",
            "IPv4Address": "172.17.0.2/16",
            "IPv6Address": ""
        }

If I run the following command, I can interact with the Redis instance and set key:value pairs:

docker run -it --link some-redis:redis --rm redis redis-cli -h redis -p 6379
set 'a' 'abc'
>OK
get 'a'
>"abc"
quit

I have also figured out how to build and run a Docker container, with the redis library installed, that runs a Python script.

Here is my Dockerfile:

FROM python:3
ADD redis_test_script.py /
RUN pip install redis 
CMD [ "python", "./redis_test_script.py" ]

Here is redis_test_script.py:

import redis
print("hello redis-py")

Build the Docker image:

docker build -t python-redis-py .

If I run the following command the script runs in its container:

docker run -it --rm --name pyRed python-redis-py

and returns the expected:

>hello redis-py

It seems like both containers are working OK; the problem is connecting them together. Ultimately, I would like to use Python to perform operations on the Redis container. If I modify the script as follows and rebuild the image for the Python container, it fails:

import redis
print("hello redis-py")
r = redis.Redis(host="localhost", port=6379, db=0)
r.set('z', 'xyz')
r.get('z')

I get several errors:

...
OSError: [Errno 99] Cannot assign requested address
...
redis.exceptions.ConnectionError: Error 99 connecting to localhost:6379. Cannot assign requested address.
.....

It looks like they're not connecting. I tried again using the bridge IP in the Python script:

r = redis.Redis(host="172.17.0.0/16", port=6379, db=0)

and get this error:

redis.exceptions.ConnectionError: Error -2 connecting to 172.17.0.0/16:6379. Name or service not known.

and I tried the Redis container's IP from the bridge subnet:

r = redis.Redis(host="172.17.0.2/16", port=6379, db=0)

and I get this error:

redis.exceptions.ConnectionError: Error -2 connecting to 172.17.0.2/16:6379. Name or service not known.

It feels like I'm fundamentally misunderstanding something about how to get the containers to talk to each other. I've read quite a lot of documentation and tutorials, but as I say, I have no networking experience and have not previously used Docker, so any helpful explanations and/or solutions would be really great.

Many thanks

2 Answers


This is all about Docker networking. The quick solution is to use host network mode for both containers. The drawback is lower isolation, but you will get it working fast:

docker run -d --network=host redis ...
docker run --network=host python-redis-py ...

Then, to connect from Python to Redis, just use localhost as the hostname.
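For example, the script from the question could connect like this (a minimal sketch, assuming both containers were started with --network=host as above):

import redis

# With host networking, both containers share the host's network stack,
# so the Redis server is reachable on localhost:6379.
r = redis.Redis(host="localhost", port=6379, db=0)
r.set("z", "xyz")
print(r.get("z"))  # b'xyz'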

A better solution is to use a Docker user-defined bridge network:

# create network
docker network create foo
docker run -d --network=foo --name my-db redis ...
docker run    --network=foo python-redis-py ...

Note that in this case you cannot use localhost; instead, use my-db as the hostname. That's why I used the --name my-db parameter when starting the first container. In user-defined bridge networks, containers reach each other by their names.
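So, with the user-defined network, the Python side would look something like this (a minimal sketch, assuming the Redis container was started with --name my-db on the same network):

import redis

# "my-db" is resolved by Docker's embedded DNS on the user-defined network.
r = redis.Redis(host="my-db", port=6379, db=0)
r.set("z", "xyz")
print(r.get("z"))  # b'xyz'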


1 Comment

Awesome - I've got both methods to work! - Thanks so much, I spent most of this afternoon on that!!

Do:

  • Explicitly create a Docker network for your application, and run your containers connected to that network. (If you use Docker Compose, this happens for you automatically and you don’t need to do anything.)

    docker network create foo
    docker run -d --net foo --name some-redis redis
    docker run -it --rm --net foo --name pyRed python-redis-py
    
  • Use containers’ --name as DNS hostnames: you connect to some-redis:6379 to reach the container. (In Docker Compose the name of the service block works too.)

  • Make the locations of external services configurable, most likely using an environment variable. In your Python code you can connect to

    redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"),
                port=int(os.environ.get("REDIS_PORT", "6379")))
    
    docker run --rm -it \
      --name py-red \
      --net foo \
      -e REDIS_HOST=some-redis \
      python-redis-py
    

Don’t:

  • docker inspect anything to find the container-private IP addresses. Between containers you can always use hostnames as described above. The container-private IP addresses are unreachable from other hosts, and may even be unreachable from the same host on some platforms.

  • Use localhost in Docker for anything, except the specific case of connecting from a browser or other process running directly on the host (not in a container) to a port you’ve published with docker run -p on the same host. (Inside a container, localhost generally means “this container”.)

  • Hard-code host names in your code like this; it makes it hard to run the service in a different environment. (For databases in particular it’s not uncommon to run them outside of Docker or even in a hosted cloud service.)

  • Use --link; it’s outdated and unnecessary.
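Putting the points above together, a revised redis_test_script.py might look like this (a sketch, reusing the REDIS_HOST/REDIS_PORT variable names from the example above):

    import os
    import redis

    # Read the Redis location from the environment; fall back to localhost
    # so the script still runs outside Docker.
    r = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"),
                    port=int(os.environ.get("REDIS_PORT", "6379")),
                    db=0)
    r.set("z", "xyz")
    print(r.get("z"))  # b'xyz'

Run it with the docker run command shown above, passing -e REDIS_HOST=some-redis so the script knows where to find the database.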

2 Comments

Very helpful to have some guidance on this. On the subject of best practices; is it better to make 2 separate containers, such as 1 for python and 1 for redis, OR is it better to use the .yml file to group several images together? Or is it simply a preference? How could this example be launched using a .yml? I'm happy to post this as a separate question if you prefer? Many thanks
Two separate containers is almost always right (especially because you can use the canned redis image unmodified). Putting them in a single docker-compose.yml is very routine, if you’re using that tool. The Docker documentation includes a Django and PostgreSQL example that you could adapt.
