During the development of our two mobile games Pixel Path and Colour Climb, we wanted a method of automating the compilation of our codebase for every commit to our repository. Not only that, but when the code successfully compiled, it should also run through numerous tests to ensure its robustness. With some research, we found that one of the best development practices to achieve this was Continuous Integration.
Note: I am by no means a CI developer. The implementation I am about to discuss, pieced together from research, may not be the most optimal way of practising CI. However, in my experience the current process gets the job done and does it well.
Continuous Integration (CI) is a development practice in which developers frequently commit code to a shared repository, with each commit verified by an automated build, allowing teams to detect problems early. In order to properly integrate this practice into our development cycle we needed a platform that supported it, which is when we came across GitLab.
GitLab is a free, open-source project whose core goal is to help development teams collaborate on software projects, providing helpful tools for every stage of the development lifecycle. Continuous Integration is built into the GitLab framework and has allowed us to quickly implement a well-defined process for compiling, testing and building our applications, whilst ensuring code quality. If you wish to know more about GitLab, you can click here.
In order for the CI process to meet our needs within the context of a Unity project, it had several automation requirements:
Once the requirements were defined, we could then research the best methods of fulfilling them.
In GitLab, a pipeline is simply a process of jobs split into individual stages. These jobs can be either dependent on or independent of previous stages, and automate steps in the SDLC including building, testing and deploying code. The benefit of this automation is that it reduces the likelihood of human error, as well as the amount of effort required by developers to manage their code during and after development.
After researching numerous forums and developer blogs during the starting phase of UnityShell, we found a solution that addresses all the requirements highlighted in the previous section. Shown below is a snippet of our pipeline process for master within the GitLab Pipeline interface; you can see there are three stages to the process: Setup, Build and Test.
Example of Pipeline Passing
This stage is responsible for numerous checks and file generation required for the subsequent stages. All of these steps are individual Python scripts, executed on the machine hosting the GitLab Runner, and are as follows:
Once the project solution files are generated, they are then passed into the build stage.
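Our generate script itself isn't shown above, but as an illustrative sketch: generating the solution files from the command line typically means launching Unity in batch mode and executing an editor method that syncs the Visual Studio solution. The editor path and the `UnityEditor.SyncVS.SyncSolution` method below are assumptions based on common community usage, not our exact script.

```python
import subprocess

# Illustrative path -- adjust for the Unity version installed on your runner.
UNITY_EXE = r"C:\Program Files\Unity\Hub\Editor\2019.4.10f1\Editor\Unity.exe"

def build_generate_command(unity_exe, project_path):
    """Build the Unity batch-mode command that (re)generates the .sln/.csproj files."""
    return [
        unity_exe,
        "-batchmode",    # run without the editor UI
        "-quit",         # exit once the method has finished
        "-projectPath", project_path,
        # A commonly used editor method for syncing the Visual Studio solution.
        "-executeMethod", "UnityEditor.SyncVS.SyncSolution",
    ]

def generate_solution(unity_exe, project_path):
    """Run Unity and raise if solution generation fails."""
    return subprocess.run(build_generate_command(unity_exe, project_path), check=True)

# Example (on the runner machine):
#   generate_solution(UNITY_EXE, "UnityShell")
```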
This stage is responsible for using MSBuild to build the C# project using the solution file provided by the previous stage. This is an important stage, as it verifies that the code being committed is valid. If the code is invalid, the stage fails and the pipeline process stops at that stage.
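As a rough sketch of what this step amounts to (the MSBuild path mirrors the one in our YAML file; the helper itself is hypothetical), a wrapper script only needs to propagate MSBuild's exit code, since a non-zero exit code is what fails the GitLab job and stops the pipeline:

```python
import subprocess

# Path to MSBuild as installed with Visual Studio 2019 Community on the runner.
MSBUILD = (r"C:\Program Files (x86)\Microsoft Visual Studio\2019"
           r"\Community\MSBuild\Current\Bin\MSBuild.exe")

def build_msbuild_command(msbuild_exe, solution):
    """MSBuild command: restore packages and build the given solution."""
    return [msbuild_exe, "/t:restore", solution]

def run_build(msbuild_exe, solution):
    """Run MSBuild and return its exit code; non-zero fails the CI job."""
    result = subprocess.run(build_msbuild_command(msbuild_exe, solution))
    return result.returncode

# Example (on the runner machine):
#   import sys
#   sys.exit(run_build(MSBUILD, r"UnityShell\UnityShell.sln"))
```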
Example of Pipeline Failing
One of the useful tools built into GitLab is the Pipeline Job breakdown page, which shows information about a specific job, including its duration and related branch, and displays all of the console output from commands associated with that job. Being able to view this information is crucial for identifying the cause of pipeline failures and quickly resolving them.
Once the build process is complete, we know the code is valid and compilable. Therefore we can run all the tests within the test framework of the project. In Unity, this is done through the Unity Test Runner, which has a useful CLI for running tests and outputting test results. In Unity, tests are split into Edit-Mode and Play-Mode tests:
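The Unity Test Runner's CLI flags for this are documented by Unity. A simplified sketch of how a script might invoke both test modes (the editor path is illustrative, and this is not our actual unity_test.py):

```python
import subprocess

# Illustrative path -- adjust for the Unity version installed on your runner.
UNITY_EXE = r"C:\Program Files\Unity\Hub\Editor\2019.4.10f1\Editor\Unity.exe"

def build_test_command(unity_exe, project_path, platform, results_path):
    """Unity Test Runner CLI command for one test platform."""
    assert platform in ("editmode", "playmode")
    return [
        unity_exe,
        "-batchmode",
        "-projectPath", project_path,
        "-runTests",                   # invoke the Unity Test Runner
        "-testPlatform", platform,     # Edit-Mode or Play-Mode tests
        "-testResults", results_path,  # NUnit XML results file
    ]

def run_all_tests(unity_exe, project_path):
    """Run Edit-Mode then Play-Mode tests, writing one results file each."""
    for platform in ("editmode", "playmode"):
        cmd = build_test_command(unity_exe, project_path, platform,
                                 f"{project_path}/{platform}-results.xml")
        subprocess.run(cmd, check=True)

# Example (on the runner machine):
#   run_all_tests(UNITY_EXE, "UnityShell")
```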
One of the benefits of using GitLab with CI is that the results generated from the Unity Test Runner are displayed within the Merge Request; when tests fail, this not only prevents the branch from being merged, but also shows the developer which tests failed along with the related console output for each failing test.
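GitLab gets this information from a JUnit report artifact; our script converts the Unity Test Runner's NUnit output using an XSLT stylesheet. As a simplified, illustrative alternative, the raw NUnit3 XML can be inspected directly with the Python standard library to pull out the names of failed tests:

```python
import xml.etree.ElementTree as ET

def failed_tests(nunit_xml):
    """Return the names of failed test cases from an NUnit3 results XML string.

    This mirrors the information GitLab surfaces in the Merge Request once
    the results have been converted to JUnit format.
    """
    root = ET.fromstring(nunit_xml)
    return [case.get("name")
            for case in root.iter("test-case")
            if case.get("result") == "Failed"]
```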
In order to properly orchestrate a runner to perform jobs within set stages in a sequential manner, GitLab requires users to create a YAML file that commands the runner to execute said tasks. YAML files are highly flexible, which means they can become highly complex. For us, we mainly write our YAML files to trigger Python commands which handle the bulk of the execution.
```yaml
stages:
  - setup
  - build
  - test

setup:
  stage: setup
  script:
    # Pull the most recent version of our Utilities scripts
    - 'python %RUNNER_PATH%\scripts\runner\utilities_updater.py %RUNNER_PATH%'
    # Check the Unity version for the project, download newer version if required
    - 'python %RUNNER_PATH%\scripts\runner\unity_setup.py "UnityShell" %CI_PROJECT_ID% %PIPELINE_TOKEN% %CI_COMMIT_REF_NAME% %RUNNER_PATH%'
  artifacts:
    paths:
      # Save these files upon stage completion, these will be used in the next stage
      - "UnityShell/UnityShell.sln"
      - "UnityShell/UnityShell.csproj"
      - "UnityShell/UnityShell.Testing.csproj"
    expire_in: 2 days
  cache:
    # Cache the Library folder because we can use this again in future pipeline processes
    key: "Library"
    paths:
      - "UnityShell/Library/"

build:
  stage: build
  script:
    # Using the solution file generated in the previous stage, build the project using MSBuild
    - '"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\Bin\MSBuild.exe" /t:restore "UnityShell\UnityShell.sln"'
  dependencies:
    - setup
  cache:
    key: "Library"
    paths:
      - "UnityShell/Library/"
    policy: pull

test:
  stage: test
  script:
    # Run Unity in test mode and store the results in an XML to be uploaded to GitLab and displayed in the Merge Request
    - 'python %RUNNER_PATH%\scripts\runner\unity_test.py "UnityShell" "%RUNNER_PATH%\dependencies\runner\nunit3-junit.xslt"'
  artifacts:
    reports:
      junit: "UnityShell/junit-results.xml"
  dependencies:
    - setup
  cache:
    key: "Library"
    paths:
      - "UnityShell/Library/"
    policy: pull
```
The above snippet is the YAML file for the entire UnityShell project. For people just starting to read and learn YAML, this snippet can be quite daunting to look at, but it's actually a lot easier to understand than first perceived. If you look above, you'll be able to make a direct correlation to the pipeline process we discussed earlier, with each job fulfilling a specific purpose that individually helps meet the requirements we set when first starting the project.
A frequent occurrence during the development of our Unity games was updating to a newer Unity version. The update process is usually fairly painless with only the occasional minor changes required for certain packages that don’t support the latest Unity version.
During the Setup stage of our pipeline process, if we try to generate a solution file through the command line without having the appropriate Unity version installed, the command will hang indefinitely, as the automated process cannot accept or decline prompts to update the Unity version. The job then hangs until the GitLab Runner times it out, resulting in a failed pipeline process.
With the release of Unity Hub v2.1, we can now automate the download and installation of Unity versions through the command line interface (CLI). With this knowledge, we developed a Python script, executed before the generate step of the Setup stage, which looks at the project's Unity version and checks it against the versions installed on the runner. If the runner doesn't have the right version, the script automatically downloads and installs the correct version through Unity Hub.
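Our actual script isn't shown here, but a hedged sketch of the idea: Unity records the project's version in ProjectSettings/ProjectVersion.txt, and Unity Hub's headless mode can install a specific version and changeset. The Hub path and the CLI flags below are assumptions based on Unity Hub 2.1's headless CLI, not our exact implementation.

```python
import subprocess
from pathlib import Path

# Illustrative default install location for Unity Hub on Windows.
HUB_EXE = r"C:\Program Files\Unity Hub\Unity Hub.exe"

def parse_project_version(text):
    """Extract (version, changeset) from ProjectSettings/ProjectVersion.txt."""
    version = changeset = None
    for line in text.splitlines():
        if line.startswith("m_EditorVersionWithRevision:"):
            # e.g. "m_EditorVersionWithRevision: 2019.4.10f1 (5311b3af6f69)"
            _, value = line.split(":", 1)
            version, revision = value.split()
            changeset = revision.strip("()")
        elif line.startswith("m_EditorVersion:") and version is None:
            version = line.split(":", 1)[1].strip()
    return version, changeset

def install_if_missing(project_path, installed_versions):
    """Install the project's Unity version via Unity Hub if the runner lacks it."""
    text = Path(project_path, "ProjectSettings", "ProjectVersion.txt").read_text()
    version, changeset = parse_project_version(text)
    if version not in installed_versions:
        # Assumed Unity Hub 2.1+ headless CLI invocation.
        subprocess.run([HUB_EXE, "--", "--headless", "install",
                        "--version", version, "--changeset", changeset],
                       check=True)

# Example (on the runner machine):
#   install_if_missing("UnityShell", {"2019.3.0f6"})
```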
Example of Setup stage checking Unity version
Although the process of downloading and installing Unity takes time, it takes much less than the runner's timeout, so the job can continue without issue once the right version has been installed.
When it comes to creating this pipeline process, having a list of requirements before we began was paramount, as it gave us the ability to focus our research on specific areas of GitLab's CI implementation. From there it was a case of trial and error, because there is success in failure: the more you fail, the quicker you learn and the better your end result will be. That goes for anything, from a wicked automated CI implementation to programming entire systems. The main thing is to have fun with it and not take it too seriously.
Nevertheless, implementing the system was a huge learning experience for myself and the team. We came out of it knowing a hell of a lot more about structuring automated systems for CI. We could have done things better, and the implementation will be improved upon in the future. As I said at the beginning, I'm no CI developer and I'm not going to pretend to be.
If you have any questions in regards to this article hit me up and I'll do my best to help!