As with anything in life, automation usually means less room for error and improvements in, well, everything. When it comes to developing a project, having to build, test, and make sure everyone else is doing the same can be a chore when done manually. Continuous integration means one thing: all of these chores are handled automatically by a back-end service, usually hooked up to the Git repository you are working on.
There are plenty of tools to choose from. A quick 'CI tools' Google search will give you a plethora of options, but this post is going to focus on a specific heavy-hitter that has recently entered the CI sphere: Azure Pipelines.
First things first, all you need to do is click that wonderful blue link at the top of the page that says 'Start free with Pipelines'. The catch here is that you need an Azure DevOps account, which, as of now, does not require a credit card. But I can only imagine that one day this changes.
The reason I say this is that in order to use Pipelines, Microsoft incidentally sets you up with a whole set of tools alongside it. Tools like 'Test Plans', which can be used to help manage the testing phases. You can even create repos and import everything from GitHub (which, ironically, is also a Microsoft property) over to the DevOps toolchain. In my humble opinion, the whole thing seems quite contrived. All I wanted was the CI, because that's what I was initially interested in when I clicked their 'Start free with Pipelines' link, but Microsoft just gives me the whole kit and caboodle. Why?
Anyway, when it's time to hook up your own GitHub repo to a pipeline, you need to create a project on the site (at least I had to), and their super simple website acts as a GUI for this. Just click on Pipelines -> New. As of right now, you need either a GitHub repo or an Azure Repos repo to do this; all other platforms are disregarded. Personally, I don't see the point in moving my GitHub repo over, so I connected my account with the site in a single click (Microsoft uses OAuth for this) and then selected the repository I wished to add a pipeline to.
As with other products, YAML files (YAML Ain't Markup Language) are used to pass your instructions to Microsoft's back-end tooling. The cool thing here is that Microsoft will offer a boilerplate file for you to use and commit it straight to the repo. It's as simple as that to get started. They don't want it to take long, and they did a great job, because all in all the process takes under 10 minutes.
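To give an idea of what that boilerplate looks like, here is a minimal sketch of an azure-pipelines.yml. The exact file Microsoft generates depends on what kind of project it detects; this one assumes a Node.js project, and the branch name and Node version are just illustrative:

```yaml
# azure-pipelines.yml -- minimal starter pipeline (sketch)
trigger:
  - master              # run the pipeline on pushes to this branch

pool:
  vmImage: 'ubuntu-latest'   # Microsoft-hosted Linux build agent

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '10.x'
    displayName: 'Install Node.js'

  - script: |
      npm install
      npm test
    displayName: 'Install dependencies and run tests'
```

Commit that to the root of the repo and every push triggers a build.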
Here's where things get somewhat interesting. For our project, we needed a specific API key available when deploying the code. This is because a Visual Studio Code extension must exist on the Visual Studio Marketplace for other people to use it, and publishing one requires that an authorized user 'push' the code up. This user, of course, can be the pipeline. Unlike with other tools, such as TravisCI, I found adding variables to a pipeline a little troublesome. You need to go into the 'Library' section of the GUI. Library of what? This was a little non-intuitive for me. Then, if you want to store just a single variable, you need to create a group for it to exist in. This is neat in some respects, because you can include certain variables for some deployments and omit them in others with the 'flick of a group' switch. However, I just want to add one darn thing.
When you gather the API key from wherever it is you get yours, you can quickly copy and paste it into this group (which you have, of course, named to a proper standard). You can name the variable in this GUI as well, and reference it in the YAML script using $(variable) syntax, so long as you also include the group it belongs to in the script:
variables:
  - group: the-group-name
A little odd, but it gets the job done.
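Putting it together, a deployment step that consumes the key might look something like this. The group name and variable name here are hypothetical (yours come from whatever you set up in Library), and I'm assuming the vsce publishing tool since that's the standard way to push a VS Code extension:

```yaml
variables:
  - group: marketplace-secrets   # hypothetical group created under Library

steps:
  # $(MARKETPLACE_PAT) is a hypothetical secret variable inside the group;
  # macro syntax substitutes it into the command line at run time
  - script: npx vsce publish -p $(MARKETPLACE_PAT)
    displayName: 'Publish the VS Code extension'
```

One thing worth knowing: variables marked as secret are not exposed as environment variables automatically, which is why the token is passed explicitly on the command line here.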
This is the extent to which I've dabbled in Azure Pipelines, but according to the documentation it seems to be quite robust: Pipelines has easy access to server variables, can run jobs simultaneously in different environments, build from only certain branches, and run bash scripts in Windows, Mac, or Linux environments. Pipelines is certainly a strong contestant in the world of CI. If you are already involved in the Microsoft DevOps ecosystem, it's almost a no-brainer due to the ease of integration.
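For instance, running the same job across all three operating systems comes down to a strategy matrix. This is a sketch based on the documentation rather than something from my own pipeline:

```yaml
# Fan one job out across three hosted agents
strategy:
  matrix:
    linux:
      imageName: 'ubuntu-latest'
    mac:
      imageName: 'macOS-latest'
    windows:
      imageName: 'windows-latest'

pool:
  vmImage: $(imageName)    # each matrix entry picks its own agent image

steps:
  - bash: echo "Building on $(Agent.OS)"
```

Each matrix entry becomes its own job, all running in parallel.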