
Pass commands from one docker container to another

I have a helper container and an app container.

The helper container handles cloning code via git into a mount shared with the app container.

I need the helper container to check for a package.json or requirements.txt in the cloned code and, if one exists, to run

npm install

or

pip install -r requirements.txt

respectively, storing the dependencies in the shared mount.
The thing is, the npm and/or pip command needs to be run from the app container, to keep the helper container as generic and agnostic as possible.
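
For illustration, the check itself is simple; here is a minimal sketch of what the helper needs to trigger, assuming the clone lives at /shared/app (a placeholder path):

#!/bin/sh
# Hypothetical helper-side logic; /shared/app is an assumed clone location.
if [ -f /shared/app/package.json ]; then
    # This is the step that should really run inside the app container:
    npm install --prefix /shared/app
fi
if [ -f /shared/app/requirements.txt ]; then
    # --target keeps the installed packages inside the shared mount.
    pip install -r /shared/app/requirements.txt --target /shared/app/deps
fi

The open question is how to execute exactly these commands in the app container instead of in the helper.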

One solution would be to mount the Docker socket into the helper container and run

docker exec <app container> <command>
but what if I have thousands of such apps on a single host?
Will there be issues with hundreds of containers all accessing the Docker socket at the same time? And is there a better way to do this, i.e. to get commands run in another container?
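
For reference, the socket approach I have in mind would look roughly like this (image and container names are placeholders, and it assumes the docker CLI is installed in the helper image):

# Start the helper with the host's Docker socket mounted:
docker run -d --name helper \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v shared-code:/shared \
    my-helper-image

# From inside the helper, run the install in the app container:
docker exec -w /shared/app app-container npm install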


Well, there is no "container to container" internal communication layer like "ssh" out of the box. In this regard, the containers are as standalone as two different VMs (aside from the networking in general).

You might go the usual way: install openssh-server on the "receiving" container and configure it for key-based authentication only. You do not need to expose the port to the host; just connect to it over the Docker-internal network. Deploy the SSH private key on the 'caller' container and the public key into .ssh/authorized_keys on the 'receiving' container at container start time (volume mount), so you do not keep the secrets in the image (build time).
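
A minimal sketch of that key distribution, assuming a key pair generated on the host and placeholder image names:

# One-time key generation on the host:
mkdir -p ./keys
ssh-keygen -t ed25519 -f ./keys/helper_key -N ""

# Receiving (app) container: mount the public key as authorized_keys.
docker network create internal-net
docker run -d --name app --network internal-net \
    -v "$(pwd)/keys/helper_key.pub:/root/.ssh/authorized_keys:ro" \
    my-app-image

# Caller (helper) container: mount the private key (ssh requires it chmod 600).
docker run -d --name helper --network internal-net \
    -v "$(pwd)/keys/helper_key:/root/.ssh/id_ed25519:ro" \
    my-helper-image

The shared user-defined network is what lets the helper reach the app container by name.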

Probably also create an ssh alias in .ssh/config and set StrictHostKeyChecking to no, since the containers could be rebuilt (which regenerates the host key). Then do

ssh <alias> your-command
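
A minimal .ssh/config for that alias might look like this (alias, host name, and user are placeholders):

Host app
    HostName app
    User root
    IdentityFile ~/.ssh/id_ed25519
    StrictHostKeyChecking no
    UserKnownHostsFile /dev/null

UserKnownHostsFile /dev/null keeps rebuilt containers from piling stale host keys into known_hosts. With that in place, ssh app npm install runs the command inside the app container over the internal network.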