AL Test Runner Pre-Release Version

TL;DR

There is now a pre-release version of the AL Test Runner extension for Visual Studio Code. It will have the latest (and possibly unstable) features.

Pre-Releases

VS Code recently added support for pre-release versions of extensions. You can install a pre-release by clicking on the “Switch to Pre-Release Version” button from the extension details within VS Code. See https://code.visualstudio.com/updates/v1_63#_pre-release-extensions for more details.

Up ’til now I have typically packaged a new version of the extension and used it myself for a week or two to check that it isn’t horribly broken before I push an update to the marketplace. Having a pre-release version gives me a better way to do that while also getting feedback from anyone who is interested in being a beta tester. GitHub issues are the best place to log requests or bugs.

What’s in the Pre-Release?

There are a few things which are currently in the pre-release but not in the release version.

Debug All Tests

Bit niche, but I have actually found it useful on a couple of occasions. There is an icon at the top of the Test Explorer view and a command in the command palette to debug all the tests, so I decided to add support for it in my extension.

A new version of the Test Runner Service app is required to support this. Install with the "Install Test Runner Service" command from inside VS Code or download the latest version from here: https://github.com/jimmymcp/test-runner-service/raw/master/James%20Pearson_Test%20Runner%20Service.app

Publishing Apps using PowerShell

There is a new setting to publish apps to the container using PowerShell (the bccontainerhelper module) rather than the publish command in VS Code.

Why? A couple of reasons.

  1. I can’t know whether the app has compiled and published successfully when using the AL: Publish command. If publishing the app fails then VS Code is left thinking that the tests are running when in reality they never started, and you need to manually cancel the test run before you can start another from the Test Explorer. Publishing from PowerShell gives a little more control (see the sketch after this list).
  2. I’m toying with the idea of automating test runs in the background while developing, something along the lines of what Luc suggested here: https://github.com/jimmymcp/al-test-runner/issues/42. This would require a more reliable way to compile and publish the app(s) than just triggering the AL: Publish command and hoping that it worked.
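To illustrate the difference, this is roughly what publishing through bccontainerhelper looks like – a minimal sketch, not the extension’s actual implementation, with the container name and paths as assumptions you’d substitute for your own:

# compile and publish with bccontainerhelper so failures surface in the script
Import-Module bccontainerhelper

$ContainerName = 'my-container'   # assumption - use your own container name

$AppFile = Compile-AppInBcContainer -containerName $ContainerName -appProjectFolder (Get-Location)
if (!$AppFile) {
    # unlike AL: Publish, we know compilation failed and can stop here
    throw 'Compilation failed - no point starting a test run'
}

Publish-BcContainerApp -containerName $ContainerName -appFile $AppFile -sync -install -skipVerification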

testRunnerCodeunitId

There is a new key in the AL Test Runner config.json file to specify the ID of the test runner codeunit to use. It defaults to the codeunit-isolation test runner, but you can override it with another if you like.
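For example, in the AL Test Runner config.json (other keys omitted; 130450 is, as far as I know, the ID of Microsoft’s codeunit-isolation test runner – worth verifying against your version):

{
    "testRunnerCodeunitId": 130450
}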

Various

Various other improvements – updated Pester tests, updated GitHub Actions. Take a look on GitHub if you are interested.

Chaining Builds in Azure DevOps

We are triggering a lot of builds in Azure DevOps these days. If anyone so much as looks at an AL file we start a new build.

OK, that’s a small exaggeration, but we do use our build pipelines for:

  • Continuous integration i.e. whenever code is pushed up to Azure DevOps we start a build
  • Verifying our apps compile and run against different localisations (more on that another time)
  • Checking that a dependent app hasn’t been broken by some changes (what we’re going to talk about now)
  • Building our app against different upcoming versions of Business Central (this is an idea that we haven’t implemented yet)

Background Reading

If you haven’t got a clue what I’m talking about, you might find a little background reading on Azure DevOps build pipelines useful before carrying on.

Overview

We’re considering a similar problem to the one I wrote about in the last post on package management – but from the other end. The question then was, “how do I fetch packages (apps) that my app depends on?” Although not explicitly stated, a benefit of the package management approach is that you’ll find out pretty quickly if there are any breaking changes in the dependency that you must handle in your app.

Obviously, you want to minimise the number of times you make a breaking change in the first place, but if you can’t avoid it then bump the major version number and do your best to let any dependent devs know how it will affect them e.g. if you’re going to change how an API works, give people some notice…I’m looking at you Microsoft 😉

But what if we’re developing the dependency and not the dependent app? There will be no trigger to build the dependent app and check that it still works.

Chaining Builds

Azure DevOps allows you to trigger a new build on completion of another build. In our scenario we’ve got two apps that are built from two separate Git repositories in the same Azure DevOps project. One is dependent upon the other.

It doesn’t really matter for the purposes of this post what the apps do or why they are split into two but, for the curious, the dependent app provides a little slice of extra functionality for on-prem customers that cannot be supported for SaaS. Consequently the dependency (which has the core functionality supported both for SaaS and on-prem) is developed far more frequently than the dependent app.

[Image: Build Triggers]

We want to check that, when we push changes to the dependency, the dependent app still works i.e. it compiles, publishes, installs and the tests still run.

You can add a “Build Completion” trigger to the pipeline for the dependent app. This specifies that when the dependency app is built (filtered by branch) a build of the dependent app kicks off.

That way, if we’ve inadvertently made some breaking change, we give ourselves a chance to catch it before our customers do.

Limitations

Currently the triggering and to-be-triggered build pipelines must be in the same Azure DevOps project – which is a shame. I’d love to be able to trigger builds across different projects in the same organisation. No doubt this would be possible to achieve through the API – maybe I’ll attempt it some day – but I’d rather this was supported in the UI.
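For the curious, queuing a build through the API boils down to a POST to the build endpoint of the target project – a minimal sketch, where the organisation, project, definition ID and the $PAT personal access token are all placeholder assumptions:

# queue a build of a pipeline in another Azure DevOps project via the REST API
$Org = 'https://dev.azure.com/my-organisation'   # assumption
$Project = 'OtherProject'                        # assumption
$DefinitionId = 42                               # assumption - the pipeline to queue

$Headers = @{Authorization = ('Basic {0}' -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes((':{0}' -f $PAT))))}
$Body = ConvertTo-Json @{definition = @{id = $DefinitionId}}

Invoke-RestMethod -Method Post -Uri ('{0}/{1}/_apis/build/builds?api-version=5.1' -f $Org, $Project) -Headers $Headers -Body $Body -ContentType 'application/json'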

An Approach to Package Management in Dynamics 365 Business Central

TL;DR

We use PowerShell to call the Azure DevOps API and retrieve Build Artefacts from the last successful build of the repository/repositories that we’re dependent on.

Background

Over the last few years I’ve moved into a role where I’m managing a development team more than I’m writing code myself. I’ve spent a lot of that time looking at tools and practices in the broader software development community. After all, whether you’re writing C/AL, AL, PowerShell or JavaScript it’s all code and it’s unlikely that we’ll face any challenges that haven’t already been faced in one way or another in a different setting.

In that time we’ve introduced a number of those tools and practices into our own development process.

Package Management

The next thing to talk about is package management. I’ve written before about the benefits of trying to avoid dependencies between your apps. However, if app A relies on app B and you cannot foresee ever deploying A without B then you have a dependency. There is no point trying to code your way round the problems that avoiding the dependency will create.

Accepting that your app has one or more dependencies – and most of our apps have at least one – opens up a bunch of questions and presents some interesting challenges.

Most obviously you need to know: where can I get the .app files for the apps that I am dependent on? Are they at least the minimum version required by my app? Are they the correct apps for the version of Dynamics NAV / Dynamics 365 Business Central that I am developing against? Are the apps that I depend on themselves dependent on other apps? If so, where do I get those from? Is there another layer of dependencies below that? Is it really turtles all the way down?

These are the sorts of questions that you don’t want to have to worry about when you are setting up an environment to develop in. Docker gives us a slick way to quickly create disposable development and testing environments. We don’t want to burn all the time that Docker saves us searching for, publishing and installing app files before we can start work.

This is what a package manager is for. The developer just needs to declare what their app depends on and leave the package manager to retrieve and install the appropriate packages.
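As a reminder, that declaration lives in the dependencies array of app.json and looks something like this (the ID, names and version are invented for illustration; older schema versions use "appId" rather than "id"):

"dependencies": [
    {
        "id": "f4f8c7a2-4f6d-4a86-8b6a-3bdf8b5a4b11",
        "name": "App B",
        "publisher": "A Developer",
        "version": "1.2.0.0"
    }
]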

The Goal

Why are we talking about this? What are we trying to achieve?

We want to keep the maintenance of all apps separate. When writing app A I shouldn’t need to know or care about the development of app B beyond my use of its API. I just need to know:

  • The minimum version that includes the functionality that I need – this will go into my app.json file
  • I can acquire that, or a later, version of the app from somewhere as and when I need it

I want to be able to specify my dependencies and with the minimum of fuss download and install those apps into my Docker container.

We’ve got a PowerShell command to do just that.

Get-ALDependencies -Container BCOnPrem -Install

There are a few jigsaw pieces we need to gather before we can start putting it all together.

Locating the Apps

We need somewhere to store the latest version of the apps that we might depend upon. There is usually some central, public repository where the packages are hosted – think of the PowerShell Gallery or Docker Hub for example.

We don’t have an equivalent repository for AL apps. AppSource performs that function for Business Central SaaS but that’s not much use to us while we are developing or if the apps we need aren’t on AppSource. We’re going to need to set something up ourselves.

You could just use a network folder. Or maybe SharePoint. Or some custom web service that you created. Our choice is Azure DevOps build artefacts. For a few reasons:

  • We’ve already got all of our AL code going through build pipelines anyway. The build creates the .app files, digitally signs them and stores them as build artefacts
  • The artefacts are only stored if all the tests ran successfully which ought to give us more confidence relying on them
  • The build automatically increments the app version so it should always be clear which version of the app is later and we shouldn’t get caught in app version purgatory when upgrading an app that we’re dependent on
  • We’re already making use of the Azure DevOps REST API for loads of other stuff – it was easy to add some commands to retrieve the build artefacts (hence my earlier post on getting started with the API)

Identifying the Repository

There is a challenge here. In the app.json file we identify dependencies by app name, id and publisher. To find a build – and its artefacts – we need to know the project and repository name in Azure DevOps.

Seeing as we can’t add extra details into the app.json file itself, we hold these details in a separate json file – environment.json (see the sample below). This file can have an array of dependency objects with a:

  • name – which should match the name of the dependency in the app.json file
  • project – the Azure DevOps project to find this app in
  • repo – the Git repository in that project to find this app in
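Something along these lines (the names are invented for illustration):

{
    "dependencies": [
        {
            "name": "App B",
            "project": "My Project",
            "repo": "app-b"
        }
    ]
}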

Once we know the right repository we can use the Azure DevOps API to find the most recent successful build and download its artefacts.

I’m aware that we could use Azure DevOps to create proper releases, rather than downloading apps that are still in development. We probably should – maybe I’ll come back and update this post some day. For now, we find that using the artefacts from builds is fine for the two main purposes we use them for: creating local development environments and creating a Docker container as part of a build. We have a separate, manual process for uploading new released versions to SharePoint for now.

The Code

So much for the theory, let’s look at some code. In brief we:

  1. Read app.json and iterate through the dependencies
  2. For each dependency, find the corresponding entry in the environment.json file and read the project and repo for that dependency
  3. Download the app from the last successful build for that repo
  4. Acquire the app.json of the dependency
  5. Repeat steps 2-5 recursively for each branch of the dependency tree
  6. Optionally publish and install the apps that have been found (starting at the bottom of the tree and working up)

A few notes about the code:

  • It’s not all here – particularly the definition of Invoke-TFSAPI. That is just a wrapper for the Invoke-WebRequest command which adds the authentication headers (as previously described)
  • These functions are split across different files and grouped into a module; I’ve bundled them into a single file here for ease

(The PowerShell is hosted here if you can’t see it embedded below: https://gist.github.com/jimmymcp/37c6f9a9981b6f503a6fecb905b03672)
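As a rough idea of the shape of that wrapper – a minimal sketch, assuming a personal access token held in a $PAT variable (the real definition has more to it):

function Invoke-TFSAPI {
    Param(
        [Parameter(Mandatory=$true)]
        [string]$Url,
        [Parameter(Mandatory=$false)]
        [switch]$GetContents
    )

    # basic auth header built from a personal access token ($PAT is an assumption)
    $Headers = @{Authorization = ('Basic {0}' -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes((':{0}' -f $PAT))))}

    $Response = Invoke-WebRequest -Uri $Url -Headers $Headers -UseBasicParsing
    if ($GetContents.IsPresent) {
        # return the raw file content (e.g. for app.json)
        $Response.Content
    }
    else {
        # return the parsed API response
        ConvertFrom-Json $Response.Content
    }
}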


function Get-ALDependencies {
    Param(
        [Parameter(Mandatory=$false)]
        [string]$SourcePath = (Get-Location),
        [Parameter(Mandatory=$false)]
        [string]$ContainerName = (Split-Path (Get-Location) -Leaf),
        [Parameter(Mandatory=$false)]
        [switch]$Install
    )

    # make sure there is an .alpackages folder to copy the dependency apps into
    if (!([IO.Directory]::Exists((Join-Path $SourcePath '.alpackages')))) {
        Create-EmptyDirectory (Join-Path $SourcePath '.alpackages')
    }

    $AppJson = ConvertFrom-Json (Get-Content (Join-Path $SourcePath 'app.json') -Raw)
    Get-ALDependenciesFromAppJson -AppJson $AppJson -SourcePath $SourcePath -ContainerName $ContainerName -Install:$Install
}

function Get-ALDependenciesFromAppJson {
    Param(
        [Parameter(Mandatory=$true)]
        $AppJson,
        [Parameter(Mandatory=$false)]
        [string]$SourcePath = (Get-Location),
        [Parameter(Mandatory=$false)]
        [string]$RepositoryName,
        [Parameter(Mandatory=$false)]
        [string]$ContainerName,
        [Parameter(Mandatory=$false)]
        [switch]$Install
    )

    foreach ($Dependency in $AppJson.dependencies) {
        # find the project and repo for this dependency from environment.json
        $EnvDependency = Get-DependencyFromEnvironment -SourcePath $SourcePath -Name $Dependency.name
        $Apps = Get-AppFromLastSuccessfulBuild -ProjectName $EnvDependency.project -RepositoryName $EnvDependency.repo

        # recurse into the dependency's own dependencies before handling this app
        $DependencyAppJson = Get-AppJsonForProjectAndRepo -ProjectName $EnvDependency.project -RepositoryName $EnvDependency.repo
        Get-ALDependenciesFromAppJson -AppJson $DependencyAppJson -SourcePath $SourcePath -RepositoryName $RepositoryName -ContainerName $ContainerName -Install:$Install

        foreach ($App in $Apps) {
            # skip the test apps
            if (!$App.FullName.Contains('Tests')) {
                Copy-Item $App.FullName (Join-Path (Join-Path $SourcePath '.alpackages') $App.Name)

                if ($Install.IsPresent) {
                    try {
                        Publish-NavContainerApp -containerName $ContainerName -appFile $App.FullName -sync -install
                    }
                    catch {
                        # it's fine if this version of the app is already published
                        if (!($_.Exception.Message.Contains('already published'))) {
                            throw $_.Exception.Message
                        }
                    }
                }
            }
        }
    }
}

function Get-AppJsonForProjectAndRepo {
    Param(
        [Parameter(Mandatory=$true)]
        [string]$ProjectName,
        [Parameter(Mandatory=$false)]
        [string]$RepositoryName
    )

    # fetch the app.json file from the root of the repository
    $VSTSProjectName = (Get-VSTSProjects | where name -like ('*{0}*' -f $ProjectName)).name
    $AppContent = Invoke-TFSAPI ('{0}{1}/_apis/git/repositories/{2}/items?path=app.json' -f (Get-TFSCollectionURL), $VSTSProjectName, (Get-RepositoryId -ProjectName $VSTSProjectName -RepositoryName $RepositoryName)) -GetContents
    $AppJson = ConvertFrom-Json $AppContent
    $AppJson
}

function Get-DependencyFromEnvironment {
    Param(
        [Parameter(Mandatory=$true)]
        [string]$SourcePath,
        [Parameter(Mandatory=$true)]
        [string]$Name
    )

    Get-EnvironmentKeyValue -SourcePath $SourcePath -KeyName 'dependencies' | where name -eq $Name
}

function Get-EnvironmentKeyValue {
    Param(
        [Parameter(Mandatory=$false)]
        [string]$SourcePath = (Get-Location),
        [Parameter(Mandatory=$true)]
        [string]$KeyName
    )

    if (!(Test-Path (Join-Path $SourcePath 'environment.json'))) {
        return ''
    }

    $JsonContent = Get-Content (Join-Path $SourcePath 'environment.json') -Raw
    $Json = ConvertFrom-Json $JsonContent
    $Json.PSObject.Properties.Item($KeyName).Value
}

function Get-VSTSProjects {
    (Invoke-TFSAPI -Url ('{0}_apis/projects?$top=1000' -f (Get-TFSCollectionURL))).value
}

function Get-RepositoryId {
    Param(
        [Parameter(Mandatory=$true)]
        [string]$ProjectName,
        [Parameter(Mandatory=$false)]
        [string]$RepositoryName
    )

    $Repos = Invoke-TFSAPI ('{0}{1}/_apis/git/repositories' -f (Get-TFSCollectionURL), $ProjectName)

    if ($RepositoryName -ne '') {
        $Id = ($Repos.value | where name -like ('*{0}*' -f $RepositoryName)).id
    }
    else {
        # no repository name given - default to the first repository in the project
        $Id = $Repos.value[0].id
    }

    # fall back to the first repository if nothing matched the name
    if ($null -eq $Id -or $Id -eq '') {
        $Id = Get-RepositoryId -ProjectName $ProjectName -RepositoryName ''
    }

    $Id
}

Working with Version Numbers in Dynamics Business Central / NAV

Specifically, I’m talking about assigning version numbers to your own code and manipulating those versions in C/AL, AL and PowerShell.

Version Numbering

There are lots of different systems for assigning a version number to some code. Some incorporate the date, or the current year and the day number within the year. There’s loads of background reading on versioning schemes out there if you’re interested.

The system we typically follow is:

Version number = a.b.c.d where:

  • a = major version – this is only incremented for a major refactoring or complete rewrite of the software
  • b = minor version – incremented when a significant new feature is implemented
  • c = fix – incremented for small changes and bug fixes
  • d = build – set to the ID of the build that created it in Azure DevOps

This system isn’t perfect and we don’t always follow it exactly as written. The line between what is just a fix and what is a new feature is a little blurry. We don’t run C/AL code through our DevOps build process, so it doesn’t get a build ID like our AL apps do. Hit the comments section and tell me how and why you version differently.

Regardless, the important thing is that you give some consideration to versioning. In particular, two different copies of your code must never go out to customers with the same version number. This is especially true for AL apps: if you want to publish an updated version of an app it must have a higher version number than the one you are replacing.

Automation

There are several situations where we need to work with version numbers in code and in scripts.

  • In the build process – reading the current version from app.json and setting the last element to equal the build ID (a sketch of this follows below)
  • In our PowerShell script that creates a new navx package from C/AL code (yes, we still use v1 extensions – let’s go into that some other time)
  • In upgrade code – what was the previous version of the app? Was it higher or lower than a given version?
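The first of those, as a rough sketch (assuming an Azure DevOps agent, where the build ID is exposed as the BUILD_BUILDID environment variable):

# stamp the build ID into the version number in app.json
$AppJsonPath = '.\app.json'
$AppJson = ConvertFrom-Json (Get-Content $AppJsonPath -Raw)

# keep the first three elements, replace the last with the build ID
$Version = [Version]$AppJson.version
$AppJson.version = [Version]::new($Version.Major, $Version.Minor, $Version.Build, $env:BUILD_BUILDID).ToString()

ConvertTo-Json $AppJson -Depth 10 | Set-Content $AppJsonPath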

If you are considering, like we used to, just treating version numbers as strings…don’t. Think about it:

Treated as versions, 1.10.0 is greater than 1.9.0, but treated as strings it isn’t. That led us to split the versions into two arrays and compare each element. It worked, but it was convoluted. And completely unnecessary.

Some bright spark in our team wondered why we can’t just use .Net’s version type. We can.
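A quick demonstration of the problem and the fix in PowerShell (string comparison works character by character, so the “1” in “1.10.0” loses to the “9” in “1.9.0”):

C:\> '1.10.0' -gt '1.9.0'
False

C:\> [Version]'1.10.0' -gt [Version]'1.9.0'
True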

C/AL

Use a DotNet variable of type Version. Construct it with the version number string. NAVAPP.GETARCHIVEVERSION returns a string that can be used.

You can use the properties of the variable to access the individual elements of the version and its methods to compare to another string (less than, less than or equal to, greater than, greater than or equal to).

Version : DotNet System.Version.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
Version2 : DotNet System.Version.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'

Version.Version('1.10.0');
Version2.Version(NAVAPP.GETARCHIVEVERSION);

IF Version2.op_LessThan(Version) THEN BEGIN
  //some upgrade code that must be run when coming from an older version than 1.10.0
END;

PowerShell

Declare a variable of a given DotNet type using square brackets. Create a new version with new, Parse or TryParse. The latter expects a version variable passed by reference and returns a Boolean indicating whether a value could be assigned.

Access the elements of the version through the properties of the variable.

C:\> $Version1 = [Version]::new(1,10,0)
>> $Version2 = [Version]::new('1.9.0')
>> $Version1.CompareTo($Version2)
1

C:\> $Version = [Version]::new(1,10,0)
>> $Version.Minor
10

C:\> $Version = [Version]::new()
>> [Version]::TryParse('monkey',[ref]$Version)
False

AL

AL has a native Version datatype. As above, create a new version either from its elements or from a string. NavApp.GetArchiveVersion returns a string that can be used (for migration from v1).

To get the version of the current module (app) or of another app use NavApp.GetCurrentModuleInfo or NavApp.GetModuleInfo.

var
  Ver : Version;
  Ver2 : Version;
  DataVer : Version;
  AppVer : Version;
  ModInfo : ModuleInfo;
  ModInfo2 : ModuleInfo;
begin
  Ver := Version.Create(1,10,0);
  Ver2 := Version.Create(NavApp.GetArchiveVersion());

  if Ver > Ver2 then begin
    //some upgrade code
  end;

  //version of the current app
  NavApp.GetCurrentModuleInfo(ModInfo);
  DataVer := ModInfo.DataVersion();
  AppVer := ModInfo.AppVersion();

  //app version of the first dependency
  NavApp.GetModuleInfo(ModInfo.Dependencies().Get(1).Id(),ModInfo2); //dependencies is 1 based, not 0 based
  AppVer := ModInfo2.AppVersion();
end;

Part 3: Integration Between Extensions in Dynamics 365 Business Central

[Animation: Trig Calculator]

Sample Code: https://github.com/jimmymcp/calculator-interface

This post is in a series (parts one and two here) discussing the challenges and practical approaches to breaking your functionality into discrete extensions and getting them to integrate with one another.

In the previous post I described my attempt to declare and implement interfaces in AL with a heady mix of a discovery pattern, Codeunit.Run and manually bound subscribers. In this post I’m going to walk through an example.

The example is, of course, a calculator. Cos, sin and tan calculations will be handled by separate modules all implementing a TRIG interface and its Calculate method.

The calculator should be able to make use of any of the calculations independently of the others and it should be possible to maintain a calculation module without affecting anything else.

[Diagram: calculator app structure]

Before we start, a few things to note:

  • We can’t actually define an interface and implement it in any formal way in AL. Not in a sense that will give you a compile-time error if you don’t implement it correctly. Microsoft are aware that this is something we need and are investigating how they might bring this to AL e.g. check out the “Designing for extensibility” session at NAVTechDays 2018. This is my attempt to bring the benefits of interfaces to Business Central development until Microsoft give us something better
  • For the sake of convenience I’m using a calculator example rather than the file handler scenario I have been discussing in this series. This approach could be considered for any scenario where you have multiple, independent implementations of similar functionality
  • Also for convenience, all of the sample code is in a single app. In reality it would be split into 5 apps as per the diagram above

Registering Implementations

With all that said, let’s get down to the details. The first thing is that each of the calculation modules registers itself as an implementation of the TRIG interface.

Each module has a pair of codeunits:

  1. Binding – responsible for subscribing to the discovery event, registering the implementation, and binding an instance of the Calculation codeunit
  2. Calculation – contains the methods that actually implement the interface events; manually bound when the implementation is invoked

The below code is from the CosBinding codeunit. It adds a new entry into the Interface Implementation table to register an implementation of the TRIG interface called COS. It also specifies the codeunit to run when the COS implementation needs to be used – itself.

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Interface Mgt.", 'OnRegisterInterface', '', false, false)]
local procedure OnRegisterInterface(var InterfaceImplementationBuffer: Record "Interface Implementation" temporary)
begin
  InterfaceImplementationBuffer.AddNewEntry('TRIG','COS',Codeunit::"Cos Binding",0);
end;

You’ll see the same code for the SIN and TAN implementations.

Looking Up Implementations

Now that we’ve got multiple implementations of the same interface we need some way of allowing code that requires the interface to select the appropriate implementation.

field(Operation; Operation)
{
  ApplicationArea = All;
  AssistEdit = true;

  trigger OnAssistEdit()
  var
    InterfaceImplementation: Record "Interface Implementation";
    InterfaceMgt: Codeunit "Interface Mgt.";
  begin
    if InterfaceMgt.LookupInterfaceImplementation('TRIG', InterfaceImplementation) then
      Operation := InterfaceImplementation."Implementation Code";
  end;
}

The Operation field on the Calculator page allows the user to select the operation they want to perform i.e. which implementation of the TRIG interface to use in the calculation.

The Interface Mgt. codeunit provides a lookup of the implementations that have been registered for a given interface and returns the selected record.
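LookupInterfaceImplementation isn’t listed in this post; a hypothetical sketch of its shape – it raises the discovery event to fill a buffer of registrations, then lets the user pick one (field and table names are assumed from the code above):

procedure LookupInterfaceImplementation(InterfaceCode: Code[20]; var InterfaceImplementation: Record "Interface Implementation"): Boolean
var
  TempInterfaceImplementation: Record "Interface Implementation" temporary;
begin
  //gather the registered implementations from the binding codeunits
  OnRegisterInterface(TempInterfaceImplementation);
  TempInterfaceImplementation.SetRange("Interface Code", InterfaceCode);
  if Page.RunModal(0, TempInterfaceImplementation) = Action::LookupOK then begin
    InterfaceImplementation := TempInterfaceImplementation;
    exit(true);
  end;
end;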

Invoking Interface Methods

Now that we’ve registered the implementations and selected the specific one we want to use, it’s time to actually invoke it.

action(Calculate)
{
  ApplicationArea = All;
  Image = Calculate;
  Promoted = true;
  PromotedCategory = Process;
  PromotedOnly = true;

  trigger OnAction()
  var
    InterfaceMgt: Codeunit "Interface Mgt.";
    AppIntegrationData: Codeunit "App Integration Data";
    Handled: Boolean;
  begin
    AppIntegrationData.SetIntegationData('Angle', Angle);
    InterfaceMgt.InvokeInterfaceEvent('TRIG', Operation, 'Calculate', AppIntegrationData, Handled);
    if Handled then
      Result := AppIntegrationData.GetIntegrationDataDecimal('Result', 0)
  end;
}

I’m using an instance of the App Integration Data codeunit as a container for the data that needs to be passed between the implementation codeunit and the codeunit that is calling it. In my case I just need to pass in an angle and retrieve the result of the calculation.
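That codeunit isn’t listed here either; conceptually it is just a name/value store that both sides can read and write. A hypothetical sketch, assuming a runtime with the Dictionary type (the object ID is arbitrary and the method names follow the sample code’s spelling – the real implementation may differ):

codeunit 50200 "App Integration Data" //id for illustration only
{
  var
    Data: Dictionary of [Text, Text];

  procedure SetIntegationData(KeyName: Text; Value: Variant)
  begin
    //store everything as text in format 9 (XML) so values round-trip safely
    Data.Set(KeyName, Format(Value, 0, 9));
  end;

  procedure GetIntegrationDataDecimal(KeyName: Text; DefaultValue: Decimal): Decimal
  var
    ValueAsText: Text;
    Result: Decimal;
  begin
    if not Data.Get(KeyName, ValueAsText) then
      exit(DefaultValue);
    Evaluate(Result, ValueAsText, 9);
    exit(Result);
  end;
}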

InvokeInterfaceEvent tells the Interface Mgt. codeunit to invoke the Calculate method of the TRIG interface using the implementation selected in the Operation field. The instance of App Integration Data is passed in along with a Handled flag.

If the event has been handled then retrieve the value of the Result variable – as a decimal – from the App Integration Data codeunit.

And that’s it.

InvokeInterfaceEvent

So how does the appropriate Calculation codeunit get called?

This is the InvokeInterfaceEvent method.

procedure InvokeInterfaceEvent(InterfaceCode: Code[20]; ImplementationCode: Code[20]; EventName: Text; var IntegrationData: Codeunit "App Integration Data"; var Handled: Boolean)
begin
  Clear(InterfaceCodeunit);
  if not GetInterfaceImplementation(InterfaceCode, ImplementationCode, InterfaceImplementation) then
    Error(NoInterfaceImplementationErr, InterfaceCode);

  InterfaceImplementation.TestField("Codeunit ID");
  Codeunit.Run(InterfaceImplementation."Codeunit ID");
  if not InterfaceCodeunit.IsCodeunit() then
    Error(NoInterfaceCodeunitErr, InterfaceImplementation."Codeunit ID", InterfaceImplementation."Interface Code", InterfaceImplementation."Implementation Code");

  OnInterfaceEvent(EventName, IntegrationData, Handled);
  Clear(InterfaceCodeunit);
end;

First, check that a valid interface and implementation have been specified and throw an error if not.

Then test that a Codeunit ID has been specified by the selected implementation and run that codeunit. As we saw above, when registering the implementation the (Cos/Sin/Tan)Binding was specified as the codeunit to run. That codeunit is responsible for binding an instance of the correct (Cos/Sin/Tan)Calculation codeunit and passing that instance back to the Interface Mgt. codeunit (see below).

The Interface Mgt. codeunit has a global InterfaceCodeunit variable which keeps that bound codeunit instance in scope, ready to respond to the OnInterfaceEvent event call.

Before calling OnInterfaceEvent we check that the InterfaceCodeunit variable does actually contain a codeunit.

After the OnInterfaceEvent call the InterfaceCodeunit is cleared to dispose of the bound codeunit and ensure it doesn’t respond to any more events until we need it again.
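For completeness, the global variable and SetInterfaceCodeunit (which you’ll see called in the binding codeunit below) are about as simple as you’d expect – a sketch inferred from the IsCodeunit() check above, since a Variant can hold a codeunit instance:

var
  InterfaceCodeunit: Variant;

procedure SetInterfaceCodeunit(NewInterfaceCodeunit: Variant)
begin
  //keep the bound codeunit instance in scope until the event has been raised
  InterfaceCodeunit := NewInterfaceCodeunit;
end;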

Binding Codeunit OnRun

This is the OnRun trigger of the CosBinding codeunit. All it does is bind an instance of the corresponding Calculation codeunit and pass that instance back to Interface Mgt.

trigger OnRun()
var
  InterfaceMgt : Codeunit "Interface Mgt.";
  CosCalculation : Codeunit "Cos Calculation";
begin
  BindSubscription(CosCalculation);
  InterfaceMgt.SetInterfaceCodeunit(CosCalculation);
end;

OnInterfaceEvent

Now that we have an instance of the appropriate Calculation codeunit bound, it will respond to the OnInterfaceEvent event and we can run whatever business logic we want.

Here is the CosCalculation codeunit. It:

  1. Subscribes to OnInterfaceEvent
  2. Has a case statement to handle the event that has been called (in real life an implementation will likely implement multiple methods)
  3. Reads the Angle variable from the App Integration Data codeunit
  4. Uses System.Math to calculate the result
  5. Stores the result in the Result variable in the App Integration Data codeunit
  6. Sets Handled to true

local procedure Calculate(var AppIntegrationData : Codeunit "App Integration Data")
var
  Math : DotNet Math;
  Angle : Decimal;
  Result : Decimal;
begin
  Angle := AppIntegrationData.GetIntegrationDataDecimal('Angle',0);
  Result := Math.Cos(Angle);
  AppIntegrationData.SetIntegationData('Result',Result);
end;

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Interface Mgt.", 'OnInterfaceEvent', '', false, false)]
local procedure OnInterfaceEvent(EventName: Text; IntegrationData: Codeunit "App Integration Data"; var Handled: Boolean)
begin
  case EventName of
    'Calculate':
      begin
        Calculate(IntegrationData);
        Handled := true;
      end;
  end;
end;

Conclusion

And there you have it. Provided you can live with the shared dependency at the bottom of the dependency tree this achieves the two objectives that we set out with:

  1. Splitting functionality into multiple, discrete apps that can be developed and maintained independently of each other
  2. Having those apps integrate with each other to provide the required functionality to the end user

It’s not the most elegant solution and coding this way means you don’t get much help from the IDE. If you mistype a variable or event name somewhere everything will compile but nothing will work.

Hopefully at some point Microsoft will give us a better solution to these challenges but in the meantime take as much or as little inspiration from our approach as you like.