Pre-Releases & GitHub Actions for Visual Studio Code Extensions

Intro

This post is going to be a bit of a brain dump about developing my VS Code extension: the branching strategy for pre-releases and releases, and using GitHub Actions to stitch it all together.

If you’re only here for the AL / Business Central content then you might want to give this one a miss. Then again, Microsoft are increasingly using GitHub for AL projects themselves (e.g. AL-Go for GitHub) – so it might be worth a look after all.

Objectives

What am I trying to achieve? I want a short turnaround through these steps:

  1. Have an idea for a new feature
  2. Implement the feature
  3. Test it and make it available for others to test
  4. Release

I use the extension pretty much every day at work so I am my own biggest customer. I want to write some new feature and start working with it in a pre-release myself to find any issues before I release it.

I also want to have a little fun with a side-project – learn a little TypeScript, practice some CI/CD, GitHub Actions and Application Insights. If anyone else finds the extension useful as well then that’s a bonus.

Overview

This is my workflow. I want to get the feature into the pre-release version of the extension on the marketplace quickly. That way I will get the new pre-release myself from the marketplace and use it in my daily work. I’ll make any fixes or improvements in updates to the pre-release before merging the code to the release version and publishing to the marketplace.

GitHub Actions

The GitHub Actions definition is fairly self-explanatory. The YAML is below, or here if you prefer. Run whenever some code is pushed. Build, test, package with npm and vsce. Run the PowerShell tests with Pester. Upload the built extension as an artifact. If the pre-release branch is being built then use vsce to publish to the marketplace with the --pre-release switch.

The actions definition in the master branch is similar but publishes to the marketplace without the --pre-release switch.

name: CI

# Controls when the action will run. Triggers the workflow on every push, on pull
# requests targeting the master branch, and manually via workflow_dispatch
on:
  push:
  pull_request:
    branches: [ master ]
  workflow_dispatch:

jobs:
  build:
    runs-on: windows-latest

    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v2

      - name: npm install, build and test
        run: |
          npm install
          npm run build
          npm test
      - name: package with vsce
        run: |
          npm install -g vsce
          vsce package
      - name: run pester tests
        shell: pwsh
        run: |
          Set-PSRepository psgallery -InstallationPolicy Trusted
          Install-Module Pester
          Install-Module bccontainerhelper
          gci *ALTestRunner.psm1 -Recurse | % {$_.FullName; Import-Module $_.FullName}
          Invoke-Pester
      - name: Upload a Build Artifact
        uses: actions/upload-artifact@v2.1.4
        with:
          name: AL Test Runner
          path: ./*.vsix

      - name: Publish to marketplace
        if: github.ref == 'refs/heads/pre-release'
        run: |
          vsce publish -p ${{ secrets.VSCE_PAT }} --pre-release
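
For comparison, the publish step in the master branch workflow might look something like this. This is a sketch based on the description above rather than the actual file:

      - name: Publish to marketplace
        if: github.ref == 'refs/heads/master'
        run: |
          vsce publish -p ${{ secrets.VSCE_PAT }}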

The personal access token for my Visual Studio account (used to publish to the marketplace) is stored in a repository secret.

Repository secrets

You can create and update these from the settings for the repository. You can read more about creating the personal access token and the options for publishing extensions to the marketplace here: https://code.visualstudio.com/api/working-with-extensions/publishing-extension

Conclusions

It is rewarding to make some changes to the extension, push them to GitHub and then 10-15 minutes later be able to use them in a new version of the extension which has been automatically published, downloaded and installed. It allows you to publish more frequently and with more confidence.

Scheduling Azure DevOps Pipelines with YAML

I had the pleasure of presenting some thoughts about developing apps for SaaS with James Crowter to the Dutch Dynamics Community yesterday. We were sharing some of our experiences of the maintenance challenge that comes with having published apps on AppSource.

How can you continuously test your apps against past, current and upcoming versions of Business Central? Perhaps two ways:

  1. Slowly drive yourself to despair with the monotony of creating different versions of Business Central environments and testing manually
  2. Automate as much of the tedious infrastructure and repetitive testing work as possible so you can concentrate on some fun stuff instead

We have two main reasons to trigger the execution of the pipeline for a given branch of an app in Azure DevOps:

  1. We have changed some code
  2. Microsoft have changed some code that we depend on

If we have changed some of our own code we should run it through the pipeline to ensure that it passes our checks, that the automated tests run and that the resulting .app files are versioned and signed correctly. It is easy to overlook some of these tasks and/or inadvertently break some existing functionality when making our changes. The pipeline is there to have our back.

At the same time, Microsoft are making changes to the base and system applications that we rely on. Even if we don’t have any planned changes for our apps we may need to make some code changes to accommodate what Microsoft have done to the ground underneath our feet.

With a bit of luck we’ll see this sort of thing:

warning AL0432: Method 'FilterReservFor' is marked for removal. Reason: Replaced by ProdOrderLine.SetReservationFilters(FilterReservEntry)

warning AL0432: Method 'CreateReservEntryFor' is marked for removal. Reason: Replaced by CreateReservEntryFor(ForType, ForSubtype, ForID, ForBatchName, ForProdOrderLine, ForRefNo, ForQtyPerUOM, Quantity, QuantityBase, ForReservEntry)

We’re using a method that Microsoft are making obsolete and which will be removed at some point in the future. No need to panic, but be aware that you should switch to the new method. Very civilised. Thanks.

With less luck we’ll find that Microsoft have introduced a change that breaks our app in some way – with a compilation error or unintended behaviour. Either way, it’s something that we want to know about.

Scheduling pipelines can help with that.

Typically we:

  • Develop against a W1 version of the latest sandbox image and run pipelines for our latest commits against mcr.microsoft.com/businesscentral/sandbox with a continuous integration trigger
  • Migrate changes backwards to BC14 and BC13 compatible versions of our apps, run pipelines against appropriate Docker images for those versions
  • Have separate branches which we rebase onto the latest commit to run pipelines against bcinsider.azurecr.io/bcsandbox and bcinsider.azurecr.io/bcsandbox-master with a schedule

The continuous integration trigger is straightforward enough. At the top of our .azure-pipelines.yml we have:

trigger:
  - '*'

The schedule is defined in a separate section of the yml file, like this:

schedules:
  - cron: 0 3 * * Sun
    displayName: Schedule insider builds
    branches:
      include: ['build/insider', 'build/insider-master']
    always: true

Those branches are the ones that are set to build against the insider Docker images. I hadn’t come across cron before, but it’s pretty simple. The schedule is defined as:

  • Minute
  • Hour
  • Day of month
  • Month
  • Day of week

Our schedule comes out as 03:00 every Sunday. Asterisks stand for any value. https://crontab.guru/ is useful for getting your head around the format.

The branches key defines which branches are included in the schedule and the always key indicates that we always want to run the pipeline, even if there haven’t been any code changes since it was last run.

You Can Ditch Our Build Helper for Dynamics 365 Business Central

I’m a bit of a minimalist when it comes to tooling, so I’m always happy to ditch a tool because its functionality can be provided by something else I’m already using.

In a previous post I described how we use our Build Helper AL app to prep a test suite with the test codeunits and methods that you want to run, either as part of a CI/CD pipeline or from VS Code.

Freddy K has updated the navcontainerhelper PowerShell module and improved the testing capabilities – see this post for full details.

The new extensionId parameter for the Run-TestsInBCContainer function removes the need to prepare the test suite before running the tests. Happily, that means we can dispense with downloading, publishing, installing, synchronising and calling the Build Helper app.

The next version of our own PowerShell module will read the app id from app.json and use the extensionId parameter to run the tests. Shout out to Freddy for making it easier than ever to run the tests from the shell 👍
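
Something like the following is what I have in mind. This is just a sketch: the container name, credentials and paths are placeholders, not taken from our module.

Import-Module navcontainerhelper
$Credential = [PSCredential]::new('admin', (ConvertTo-SecureString 'P@ssword1' -AsPlainText -Force))
# Read the app id from app.json so there is no need to prepare the test suite first
$AppId = (Get-Content '.\app.json' -Raw | ConvertFrom-Json).id
Run-TestsInBcContainer -containerName 'bcserver' -credential $Credential -extensionId $AppId -detailed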

YAML Multi-Stage Pipelines in Azure DevOps, Stage 2

In the previous post I introduced you to multi-stage YAML pipelines: Build/Release pipelines vs. a multi-stage pipeline, enabling the preview feature (it’s still in preview at the time of writing) and an overview of the structure of the file.

Now we’ll take a more detailed look at an example multi-stage YAML file. This is geared at building apps for Business Central but the principles are transferable to any other application you are targeting. This is the example that I talked through in my recent webinar. If you’d rather view it as a gist you can see that here.

trigger:
- '*'

pool:
  name: Default

variables:
  image_name: mcr.microsoft.com/businesscentral/sandbox
  container_name: Build
  company_name: My Company
  user_name: admin
  password: P@ssword1
  license_file: C:\Users\james.pearson.TECMAN\Desktop\Licence.flf

stages:
- stage: build
  displayName: Build
  jobs:
  - job: Build
    pool:
      name: Default
    steps:
      - task: PowerShell@1    
        displayName: Create build container
        inputs:
          scriptType: inlineScript
          inlineScript: > 
            Import-Module navcontainerhelper;
            $Credential = [PSCredential]::new('$(user_name)',(ConvertTo-SecureString '$(password)' -AsPlainText -Force));
            New-NavContainer -accept_eula -accept_outdated -containerName '$(container_name)' -auth NavUserPassword -credential $Credential -image $(image_name) -licenseFile $(license_file) -doNotExportObjectsToText -restart no -shortcuts None -useBestContainerOS -includeTestToolkit -includeTestLibrariesOnly -updateHosts
      - task: PowerShell@1
        displayName: Copy source into container folder
        inputs:
          scriptType: inlineScript
          inlineScript: >
            $SourceDir = 'C:\ProgramData\NavContainerHelper\Extensions\$(container_name)\Source';
            New-Item $SourceDir -ItemType Directory;
            Copy-Item '$(Build.SourcesDirectory)\*' $SourceDir -Recurse -Force;
      - task: PowerShell@1
        displayName: Compile app
        inputs:
          scriptType: inlineScript
          inlineScript: >
            Import-Module navcontainerhelper;
            $SourceDir = 'C:\ProgramData\NavContainerHelper\Extensions\$(container_name)\Source';
            $Credential = [PSCredential]::new('$(user_name)',(ConvertTo-SecureString '$(password)' -AsPlainText -Force));
            Compile-AppInNavContainer -containerName '$(container_name)' -appProjectFolder $SourceDir -credential $Credential -AzureDevOps -FailOn 'error';
      - task: PowerShell@1
        displayName: Copy app into build artifacts staging folder
        inputs:
          scriptType: inlineScript
          inlineScript: >
            $SourceDir = 'C:\ProgramData\NavContainerHelper\Extensions\$(container_name)\Source';        
            Copy-Item "$SourceDir\output\*.app" '$(Build.ArtifactStagingDirectory)'
      - task: PowerShell@1
        displayName: Publish and install app into container
        inputs:
          scriptType: inlineScript
          inlineScript: >
            Import-Module navcontainerhelper;        
            Get-ChildItem '$(Build.ArtifactStagingDirectory)' | % {Publish-NavContainerApp '$(container_name)' -appFile $_.FullName -skipVerification -sync -install}
      - task: PowerShell@1
        displayName: Run tests
        inputs:
          scriptType: inlineScript
          inlineScript: >
            $Credential = [PSCredential]::new('$(user_name)',(ConvertTo-SecureString '$(password)' -AsPlainText -Force));
            $BuildHelperPath = 'C:\ProgramData\NavContainerHelper\Extensions\$(container_name)\My\BuildHelper.app';
            Download-File 'https://github.com/CleverDynamics/al-build-helper/raw/master/Clever%20Dynamics_Build%20Helper_BC14.app' $BuildHelperPath;
            Publish-NavContainerApp $(container_name) -appFile $BuildHelperPath -sync -install;
            $Url = "http://{0}:7047/NAV/WS/{1}/Codeunit/AutomatedTestMgt" -f (Get-NavContainerIpAddress -containerName '$(container_name)'), '$(company_name)';
            $AutomatedTestMgt = New-WebServiceProxy -Uri $Url -Credential $Credential;
            $AutomatedTestMgt.GetTests('DEFAULT',50100,50199);
            Import-Module navcontainerhelper;
            $ResultPath = 'C:\ProgramData\NavContainerHelper\Extensions\$(container_name)\my\Results.xml';        
            Run-TestsInBcContainer -containerName '$(container_name)' -companyName '$(company_name)' -credential $Credential -detailed -AzureDevOps warning -XUnitResultFileName $ResultPath -debugMode
      - task: PublishTestResults@2
        displayName: Upload test results    
        inputs:
          failTaskOnFailedTests: true
          testResultsFormat: XUnit
          testResultsFiles: '*.xml'
          searchFolder: C:\ProgramData\NavContainerHelper\Extensions\$(container_name)\my

      - task: PublishBuildArtifacts@1
        displayName: Publish build artifacts
        inputs:
          ArtifactName: App Package
          PathtoPublish: $(Build.ArtifactStagingDirectory)

      - task: PowerShell@1
        displayName: Remove build container
        inputs:
          scriptType: inlineScript
          inlineScript: >
            Import-Module navcontainerhelper;
            Remove-NavContainer $(container_name)
        condition: always()
- stage: Release
  displayName: Release
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
  jobs:
  - deployment:
    displayName: Release
    pool:
      name: Default
    environment: Release
    strategy:
      runOnce:
        deploy:
          steps:               
            - task: PowerShell@1
              displayName: Copy artifacts to release directory                  
              inputs:
                scriptType: inlineScript
                inlineScript: >
                  $Path = Split-Path '$(System.ArtifactsDirectory)' -Parent;
                  $Artifact = "$Path\App Package\*.app";
                  Copy-Item $Artifact 'C:\Release\';

Build

I’ve got a simple Business Central app. I want to use the Build stage to take my source AL code and:

  • Compile it into an .app file
  • Publish and install it into a container
  • Load the test codeunits and methods into the DEFAULT test suite
  • Run the tests
  • Upload the test results to the build
  • Upload the .app file as a build artifact

Almost all of these steps are performed with a PowerShell task. I won’t talk you through the PowerShell; you can read the code in the steps, read more about the use of the navcontainerhelper module on Freddy’s blog, or dig around the PowerShell posts on my blog. I will just mention that loading the test codeunits relies on our AL Build Helper app as described previously.

The final step to run is a PowerShell script to remove the container that has been created for the build. I always want this step to run, even if another step above it has failed; the condition: always() line takes care of that.

Release


So far this is fairly familiar territory and stuff that we’ve been through before. The more interesting part for the purposes of this post is having a second stage to run.

First notice the condition attached to the Release stage:

and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))

build.SourceBranch is one of the built-in variables that you can access in the pipeline which holds the name of the branch that triggered the build. This condition means the Release stage will only run if the pipeline has succeeded up to this point and the pipeline was triggered by the master branch.

This is useful for a CI/CD scenario where you want to trigger the pipeline for changes to any branch (and why would you not?) but only want to Release the code when it is merged back into the master branch.

My pipeline uses a specific type of deployment job which allows you to target a particular environment. The ‘Environments’ menu item is displayed when you enable the multi-stage pipelines preview feature. Including a deployment job in your pipeline will instruct Azure DevOps to automatically download the artifacts from the Build stage to the agent (see here).

The Release stage can include as many steps as you need depending on your definition of ‘releasing’ your software. It might include steps to:

  • Use PowerShell to publish the .app file to an on-prem instance of Business Central
  • Use the admin API to publish the .app file to a Business Central SaaS tenant
  • Upload the .app file to some other location: FTP, SharePoint, network path

As a very simple example my Release stage is simply going to copy the .app file that was downloaded as an artifact from the Build stage into a C:\Release folder on the build agent.

Environments

One of the good things about creating an environment and targeting it with a deployment job is that you can see at a glance which version of your software the environment is hosting.

Drill down on the environment to see details of the current and previous versions that have been deployed and all the relevant corresponding detail – the pipeline that deployed the software, the logs, commits and work items. Beautiful.

YAML Multi-Stage Pipelines in Azure DevOps, Stage 1

Let’s return to the subject of pipelines and this time let’s talk multi-stage pipelines. What are they and why might you want to implement them in your YAML file?

Builds/Releases

With the approach that Microsoft are now calling “classic” pipelines there was a definite division between a build pipeline and a release pipeline.

A build starts with a given version of your source code (a particular commit in your git repository, say) and proceeds to define the steps that should be performed on that code to “build” it.

You decide what “build” means and define the steps as you need them. In a Business Central AL extension context we’re probably talking: compile the extension into an app file, publish and install and run some tests.

A release takes artifact(s) that have been created by a particular build and/or code from a particular repository and “releases” them. Again, you define whatever “release” actually means to you. Publish an app file into a Business Central database, upload it to SharePoint, decompress the app file and send the source code to a printer – whatever you want.

Build pipelines can still be defined in the classic, visual editor or in a YAML file. The Azure DevOps interface makes it pretty obvious which way they recommend you do this. It took me a second to spot the discreet "use the classic editor" link when creating a new pipeline.

Clicking that link, and successfully avoiding the top option on the following page (which still creates a YAML file anyway), will get you to the classic, visual editor. Select the agent that is going to run this pipeline, add one or more jobs and add one or more tasks to each job.

Even now, you’ll notice a “View YAML” link in the top right hand corner of the screen. Subtle. The term “classic” usually means something different when we’re talking about software than when we’re talking about novels. Less “masterpiece, will still be appreciated a century from now” and more “outdated, will be made obsolete and removed a few months from now”.

It’s probably a safe bet that the “classic” editor is going to go the same way as NAV’s “classic client” with its “classic reports”.

For completeness, the Release editor looks like this:

I’ve defined the build pipeline that provides the artifacts that will be released and can now define the stages and tasks involved in releasing it. At the time of writing you will still get this editor when creating a new pipeline from the Releases menu.

Multi-Stage Pipelines

Enter multi-stage pipelines. Rather than defining your build and your release tasks in separate editors you can define them in a single YAML pipeline definition.

You’ll need to enable the preview feature (from your profile menu in the top right hand corner). You’ll notice that the “Builds” option disappears from the Pipelines menu and is replaced with two new options “Pipelines” and “Environments”. Intriguing.

Now we can work with pipelines that look something like this:

trigger:
- '*'
parameters:
  image_name: a.docker.image
  container_name: my_container
stages:
- stage: build
  jobs:
  - job: Build
    pool:
      name: Default
    steps:
      (definition of the steps that are included in the build stage)
- stage: release
  condition:  and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
  jobs:
  - deployment:
    pool:
      name: Default
    environment: QA
    (further definition of the steps involved in the release stage)

We’ll go into the details of a complete multi-stage YAML pipeline in another post. For now I just want to outline the structure of the file:

stages (1) -> stage (1..*) -> jobs (1) -> job (1..*) -> steps/deployment/tasks

You can include as many stages as you need to effectively manage the build and deployment of your software. Each stage can evaluate a condition expression which decides whether the stage should be run or not. In my case I only want to run the release stage if the pipeline has succeeded up to this point and the pipeline has been triggered by the master branch.

By default, stages will be run in series and will be dependent upon the previous stage. You can mix this up by defining the dependsOn key for each stage.
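
For instance, here is a sketch of using dependsOn to run two test stages in parallel after the build and only release once both have succeeded. The stage names and steps are made up for illustration:

stages:
- stage: build
  jobs:
  - job: Build
    steps:
    - script: echo "compile and package the app"
- stage: test_current
  dependsOn: build
  jobs:
  - job: Test
    steps:
    - script: echo "run tests against the current sandbox image"
- stage: test_insider
  dependsOn: build
  jobs:
  - job: Test
    steps:
    - script: echo "run tests against the insider image"
- stage: release
  dependsOn:
  - test_current
  - test_insider
  jobs:
  - job: Release
    steps:
    - script: echo "release the app"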

Environments

You’ll also notice that my ‘deployment’ job includes the environment to which I’m going to deploy my software. This will correspond to an environment that you have created from the Environments option of the Pipelines menu in Azure DevOps.

You can control how and when code is released to a given environment with stage conditions (as above) or with manual approvals.

Open the environment and select ‘Checks’ from the menu. All approvers that are entered on the following page must approve a pipeline before the deployment to the environment will proceed. The pipeline will be paused in the meantime.

Next time…

I hope that’s enough to whet your appetite to go and investigate the possibilities for yourself and see if/how you can start making use of this in your own development team.

Next time we’ll go through a more complete example of a multi-stage YAML pipeline and how it is put together and works. Until then you might like to check out the recording of the webinar that I did for Areopa webinars. If you like it, do them a favour and subscribe to the channel, thanks.