Adventures in the world of GitLab CI

Laravel, Docker, CI/CD, GitLab, PHP


I'll start this article with an important disclaimer: I hadn't used GitLab's continuous integration before this, so this post can be seen as something of an introduction.

How does the whole thing work?

In general, every repository is already connected to the GitLab CI system. The only thing needed to enable CI is to commit a .gitlab-ci.yml file. From then on, a new CI job starts on each subsequent commit.
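To make this concrete, here is a minimal sketch of such a file (the job name and commands are my own illustration, not from the project described later):

```yaml
# .gitlab-ci.yml — committing this file to the repository enables CI;
# every subsequent commit then starts a pipeline running the jobs below.
image: php:8.2-alpine        # the Docker image the job runs in

check:                       # a hypothetical job name
  script:
    - php --version          # any shell commands you want executed
```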

The CI job is described in the .gitlab-ci.yml file. Generally speaking, it contains your chosen Docker image and all the commands you want it to execute.

One cool option GitLab gives you is that you can define variable names and values in your account for later use in the .gitlab-ci.yml file. This way your sensitive data stays out of the repository and away from the public. At the time of writing, the variables are entered through Project -> Settings -> Pipelines -> Secret Variables. The documentation of GitLab CI is here.
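As a sketch of how this looks in practice (both the DEPLOY_TOKEN variable name and the deploy.sh script are hypothetical, just for illustration): a variable defined in the UI is injected into the job as an environment variable, so its value never appears in the repository:

```yaml
deploy:
  script:
    # DEPLOY_TOKEN is defined under Project -> Settings -> Pipelines -> Secret Variables;
    # inside the job it is available as a normal environment variable.
    - ./deploy.sh --token "$DEPLOY_TOKEN"
```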

What exactly did I do?

In this first encounter with GitLab's CI implementation I had a fairly simple task: on every commit to a PHP/Laravel project, the project had to be built, tested and deployed. Following various guides and documentation, I eventually managed it with the following content in my .gitlab-ci.yml file.

        image: edbizarro/gitlab-ci-pipeline-php:alpine

        before_script:
            - sudo apk --update add openssh-client
            - cd laravel
            - composer install
            - composer dump-autoload
            - php artisan clear-compiled
            - php artisan cache:clear
            - php artisan route:clear
            - php artisan view:clear
            - php artisan config:clear
            - cd ..
            - mkdir -p ~/.ssh
            - eval $(ssh-agent -s)
            - '[[ -f /.dockerenv ]] && echo -e "Host *\n\tStrictHostKeyChecking no\n\n\tServerAliveInterval 30\n\n\tServerAliveCountMax 6\n\n\tPort 1022\n\n" > ~/.ssh/config'

        deploy:
            script:
                - ssh-add <(echo "$MY_SSH_PRIVATE_KEY")
                - rsync -a -q -z /builds/GITLAB_ACC/REPO_NAME/some_laravel_dir/ USER@HOST_NAME:some_laravel_dir
                - rsync -a -q -z /builds/GITLAB_ACC/REPO_NAME/your_www_dir/ USER@HOST_NAME:your_www_dir

Now a little explanation of what's what

With image, you choose which Docker image the CI job runs in. There are a lot of Docker images available; you can find them with a Google search, or in a guide or documentation where someone has recommended a good one. It's very important to choose your image carefully, because different images come with different pre-installed software, scripts and configurations, and many are poorly documented, with insufficient information about what's actually inside. My recommendation is to use something popular, well documented and, of course, appropriate for the project. In the case of edbizarro/gitlab-ci-pipeline-php:alpine, the cool thing is that everything I need is pre-installed, including the PHP extensions. This means I don't have to wait for software to install and PHP extensions to compile on every run, which was actually the case with other images.

The before_script section is a list of preparatory steps that are executed before the task itself. A brief description of the steps in my case:

  • update the package index and install openssh-client
  • cd into the Laravel folder
  • run a set of composer and Laravel artisan commands to install dependencies and clear caches
  • cd back out of the Laravel folder
  • write the SSH client settings: disable strict host key checking, set a keep-alive interval, and set the preferred connection port
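The last step's one-liner is hard to read because of the escaped newlines. Expanded, it writes roughly this ~/.ssh/config (a sketch using printf, written to a temporary directory so it doesn't touch the real ~/.ssh):

```shell
# Write the same SSH client settings the pipeline generates, into a demo file.
mkdir -p /tmp/demo-ssh
printf 'Host *\n\tStrictHostKeyChecking no\n\tServerAliveInterval 30\n\tServerAliveCountMax 6\n\tPort 1022\n' > /tmp/demo-ssh/config
cat /tmp/demo-ssh/config
```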

The deploy part describes the CI task itself. The steps are:

  • add my private SSH key to the SSH agent
  • upload the files with rsync
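The `ssh-add <(echo "$MY_SSH_PRIVATE_KEY")` line relies on bash process substitution: `<(command)` exposes the command's output as a file-like path, so the key is fed to ssh-add straight from the environment variable without ever being written to disk. A tiny illustration, with cat standing in for ssh-add and an obviously fake secret:

```shell
#!/usr/bin/env bash
SECRET="not-a-real-key"      # stand-in for $MY_SSH_PRIVATE_KEY
# cat reads from the temporary file-like path created by <( ... )
cat <(echo "$SECRET")        # prints: not-a-real-key
```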

Difficulties that I encountered and how I overcame them

I originally decided to upload via an FTP client, because for some weird reason I thought it would be better. Big mistake: slow, difficult and problematic. I tried lftp and ncftp, and hit one pretty strange problem: not all directories and files were uploaded, only some of them, even though I used the recursive flag. I suspect the cause was some kind of timeout, but I really had no desire to debug it, so I just switched to SSH. This was way better: it's fast, more secure and, best of all, it works!
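For reference, here is what the rsync flags in the deploy step do, annotated (same command shape as in the pipeline above, but with a placeholder local path, and USER is a stand-in I added):

```yaml
deploy:
  script:
    # -a  archive mode: recurse into directories and preserve permissions,
    #     modification times, symlinks and ownership where possible
    # -q  quiet: suppress non-error messages
    # -z  compress file data during the transfer
    - rsync -a -q -z ./some_laravel_dir/ USER@HOST_NAME:some_laravel_dir
```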


GitLab surprised me; I was expecting a much worse free-tier CI implementation. Aside from the problems I faced, which were entirely caused by my own ignorance, everything went well. Currently, with each commit the new version is deployed automatically. This saves me a ton of time, because I don't have to do it myself.

If you find yourself on the lookout for more automation that will help you accomplish your tasks faster and more efficiently, go ahead and write us a few lines about it; we can help!


Want some more code?

Subscribe! This way you'll get tips on how to optimize, automate and manage more efficiently.