As the title says, I'm thinking about a dockerized Jenkins. I have a running container that runs all the tests, but now I want to run some deployment jobs.
The files (.py, .conf, .sh) will be copied into folders that are mounted by another container (the app container). From what I've seen, some people recommend not using Docker for this.
Now I'm wondering: should I keep Jenkins in a container (in which case I must find a way to run the deployment script from inside it), or should I install it directly on the server?
We're experimenting with containerizing Jenkins in production. The flexibility of being able to easily set up or move instances offsets the learning pain, and that pain has come in two forms:
1 - Some build jobs are themselves containerized, which requires running Docker-in-Docker. This is possible by passing the host's docker.sock into the Jenkins container (more reading: https://getintodevops.com/blog/the-simple-way-to-run-docker-in-docker-for-ci; a minimal sketch also follows this list). It requires that the host and the Jenkins container run identical versions of Docker, but I can live with that.
2 - SSH keys are a bigger issue. SSH agent forwarding into Docker containers is notoriously unreliable, so we've always copied keys into the containers (setting aside the security questions for the scope of this answer). With an on-the-host Jenkins instance, we put our SSH keys in Jenkins' home folder and everything works seamlessly. But a dockerized Jenkins keeps its home folder inside a Docker volume, which is owned by the host system, so the keys end up with permissions that are too open. We got around this by copying the keys to a folder outside Jenkins' home, chown/chmod'ing those keys to the Jenkins container user, and adding the key path to the container's /etc/ssh/ssh_config (see the second sketch below).
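
For reference, here is roughly what point 1 looks like on the command line. This is a sketch rather than our exact setup: the image tag, volume name, and ports are the usual defaults, not anything specific to the question.

```bash
# Hand /var/run/docker.sock to the container so jobs inside Jenkins can drive
# the host's Docker daemon (builds run as sibling containers, not nested ones).
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts

# The container still needs a Docker *client* binary: either bake one into a
# derived image or mount the host's (-v "$(which docker)":/usr/bin/docker).
# That second option is where the "identical versions on host and container"
# constraint comes from. You may also need to give the in-container jenkins
# user access to the socket, e.g. by matching the host's docker group GID.
```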
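And a sketch of the key workaround from point 2, assuming the container from the sketch above; the /opt/ssh-keys path and the id_rsa key name are illustrative placeholders, not our real paths.

```bash
# Copy the key to a path outside the mounted Jenkins home, so the host
# volume's ownership rules no longer apply to it.
docker exec -u root jenkins mkdir -p /opt/ssh-keys
docker cp ~/.ssh/id_rsa jenkins:/opt/ssh-keys/id_rsa

# Lock the key down to the in-container jenkins user, then point the
# container-wide ssh client config at it so every job picks it up.
docker exec -u root jenkins sh -c '
  chown jenkins:jenkins /opt/ssh-keys/id_rsa &&
  chmod 600 /opt/ssh-keys/id_rsa &&
  echo "IdentityFile /opt/ssh-keys/id_rsa" >> /etc/ssh/ssh_config
'
```

Because the key now lives in the container filesystem rather than in the host-owned volume, ssh stops complaining about overly open permissions.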