Thinking of practical production problems: we use Docker at work and it's great for deployment, but the dev setup was painful.
Let me give you an example. I need three services running to be able to make changes to a package. The three services come from the registry, so there are no Dockerfiles for them locally. The docker-compose.yml just starts those three services and builds the Dockerfile for the package.
Now for the dev environment, the way we solve it is by creating a docker-compose.override.yml. This works great, doesn't it? The problem is that we still need to support Python 2 (I know, please don't judge). So we ended up with two dev environments, one per Python version. In the end I have two Dockerfiles, one for each Python version, and four docker-compose files: one dev and one deployment file per version. (Some projects reduced the number of docker-compose.yml files by putting the Python 2 and Python 3 variants in the same file as different services, but that blew up the file.)
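One way the two Dockerfiles could perhaps be collapsed into one is a build argument for the base image, selected per environment in the override file. This is only a sketch, assuming the package is pip-installable; the service name, image tags, and paths here are made up for illustration, not from our actual setup:

```dockerfile
# Dockerfile -- PYTHON_IMAGE is a hypothetical build arg, defaulting to Python 3
ARG PYTHON_IMAGE=python:3.9
FROM ${PYTHON_IMAGE}
WORKDIR /app
COPY . .
RUN pip install -e .
```

```yaml
# docker-compose.override.yml (dev) -- "package" is an illustrative service name
services:
  package:
    build:
      context: .
      args:
        # switch to python:2.7 for the legacy environment,
        # e.g. via PYTHON_IMAGE=python:2.7 docker-compose up --build
        PYTHON_IMAGE: "${PYTHON_IMAGE:-python:3.9}"
    volumes:
      - .:/app   # mount the source so dev changes are picked up
```

With this, `docker-compose up` automatically merges docker-compose.yml with the override, so one Dockerfile and one override could in principle cover both Python versions instead of duplicating the files.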
We might be overdoing this and be completely wrong in our approach. I'm open to feedback.