Trigger a Power Automate Flow from Business Central for a Selected Record

Intro

This is my first stop on the Power Platform learning train. As I mentioned in the first post, this seems like a significant moment to me. Over the last few versions users have had more control over the web client. Between the Personalise and Design options you can tweak a lot of the elements on a page.

If you wanted to introduce some logic you could do that through Power Automate. Trigger a flow in response to some event in Business Central, call some other service, post some data back to BC. Lovely, but all a little hidden away from the Business Central UI (with the exception of approvals, which remain a confusing hybrid of Power Automate and old Business Central workflow). If you wanted to add some Power Automate into the mix and make it prominent in the UI you still needed to do some AL development.

Not any more.

Automate Menu

You might have noticed that there is an Automate menu pretty much everywhere now.

Let’s try clicking on Create a flow and see what happens. Power Automate is opened to a new flow with the Business Central trigger, “For a selected record (v3)”.

You can see that there are a few options for the trigger:

  • Environment: either specify a single environment that this flow can be used in or leave blank to have it appear in all environments
  • Company: specify a single company or leave blank for all
  • Page or table: to determine which page(s) the flow can be triggered from
  • UI: optionally add some inputs for the user to fill in when they run the flow

Example: Translate Item Description

I want to add a flow to get the description of an item, translate it into another language and update the record with the translation. I want the user to be able to choose the language that they want to translate into.

This is what the flow looks like:

I’ve added a text input to the trigger to allow the user to enter the language that they want to translate to. I’ve also specified that this flow applies to “TABLE27” – so the item list and item card pages.

From there on:

  • Get the item record (the “for a selected record” trigger outputs the SystemId of the record that the flow was run from)
  • Use a connection to Microsoft Translator (a free, throttled connection). Translate the item displayName (description) into the language entered by the user (the Language input from the trigger is available as an output)
  • Update the item record with the output of the translation

Let’s test it with the “Facia Panel with display” item. I’ll enter “ja” for Japanese as the target language for translation. (You can also define a list of valid values for the user to choose from rather than having free entry).

Notes

Use the Trigger Outputs

If you want to be able to use this flow in any environment and company, don't hardcode those values in the get/update record actions. Use the outputs from the trigger instead.
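
For instance, rather than picking a fixed environment and company in the get/update record actions, you can take them from the trigger. A minimal sketch, with illustrative property names (pick the real outputs from the trigger's dynamic content in your own flow):

triggerBody()?['environmentName']
triggerBody()?['companyName']
triggerBody()?['rowId']

The same trigger outputs can be reused in the update record action so the flow works wherever it is run from.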

Constructing the PATCH Request

I’m constructing the request for the update record action with this expression. This is the part that seems the least low-code to me, but maybe there is an easier way to achieve this.

json(concat('{"displayName": "', outputs('Translate_item_description')?['body'], '"}'))
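
One possible alternative (an assumption on my part, not how the original flow was built) is to build the payload with setProperty and skip the string concatenation:

setProperty(json('{}'), 'displayName', outputs('Translate_item_description')?['body'])

That avoids escaping the quotes by hand, though it is arguably no more low-code than the concat version.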

Captions

The caption for the action is taken from the name of the flow.

The captions for the UI are set in the flow.

As an ISV we are always conscious of the languages that we need to support and captions that we need to translate. Unfortunately I couldn’t find anything about providing translations for these captions.

We do have the user id and name of the user that triggered the flow, so perhaps it is possible to retrieve the language of that user and adapt the UI, but it definitely doesn't look like something a consultant or citizen developer is going to do.

Personalise

When you personalise the page you can move actions out of the Automate group just like other actions. I've dragged the translate action into the Home group here. It doesn't look like you can choose an image for the promoted action though.

Thoughts

OK, so this is a trivial example, but even so I am impressed. You can add an action to specific pages, across all environments and their companies, with some UI, to integrate with external services without writing any AL code.

Power Automate handles most of the complexity of connecting to and authenticating with an external service for you.

For real use cases you might be more dependent on doing some API development in Business Central first to expose the correct data and bound actions, but that division makes sense to me. Handle the Business Central stuff in Business Central and handle the integration with other services in Power Platform.

How does that all hang together? How do you deploy an entire solution when that solution consists of an AL app, some Power Automate flows, maybe a Power App? What about source control? How do you manage credentials to external services when you deploy into another tenant? All good questions that I don’t have any answers to yet.

Getting Onboard with Power Platform as a Business Central Developer

Intro

One of the things that I wanted to come away with from Directions EMEA last week was a better overview of the Power Platform. What is it? What can we do with it? When should we use it? How does it fit into the overall solutions that we are designing for our Business Central customers?

If you’re anything like me then you’ve been aware that you need to get on the Power Platform train at some point – but you haven’t been quite sure how or when. You’ve been watching suspiciously from the platform wondering where this train even goes, how it gets there and how much a ticket costs. On closer inspection, it looks like the train is already full of consultants and citizen developers digitally transforming each other…*shudder*

Still, we can’t put it off forever. If Microsoft are to be believed then the vast majority of customisations in the future are going to be developed in low-code platforms.

Good news for me was that there was more Power Platform content at Directions than you could shake two sticks at. I still have some questions about how to manage Power Platform development as part of our Business Central solutions, but I've come away more convinced that it is going to be an important part of the future, and with an inclination to get on board.

Series

I haven’t been blogging for a while because, life, but I thought it might be interesting to blog my way through my Power Platform learning curve. Hopefully you’ll find something useful but at the very least you’ll be able to laugh at my fumbling attempts to make sense of it all.

Flow from Business Central

I’m going to start with something that was demoed by Microsoft in one of the Directions keynotes that struck me as very significant. We can trigger a Power Automate flow from an action on a page in Business Central.

But couldn’t we do that before? Yes, we’ve been able to trigger a flow with an HTTP trigger and call the URL from an action on a page, for example. What’s different now is that you can add a new action to specific tables and/or pages in Business Central, with some UI, to trigger a flow for a given record without any AL development.

Maybe you’re thrilled by the possibilities that this opens up. Finally you are not so dependent on developers to get stuff done. Or maybe you are horrified that anyone can add an action to a page without building an extension, adding the code to source control, running a pipeline or any tests.

I’ll show some examples and explore the possibilities next time…

Tip: Share a Git Hooks Directory Across Your Repositories

TL;DR

git config --global core.hookspath '<path to hooks directory>'

Sharing Hooks Across Repos

I posted before about using a pre-commit hook to check that I’m not committing anything that I really shouldn’t be (anything I’ve tagged with //DONOTCOMMIT).

Hooks are specified in the .git/hooks directory. That’s great, a git repository is completely contained within its parent folder, you can copy it somewhere else and all of the code, history and config come with it.

It's not so convenient if you want to create some hooks that apply across multiple repositories though. You could just copy your hook files between all of your repos, but it turns out that there is a smarter way. Git config has a core.hookspath key. You can create a folder somewhere with the hooks that you want to apply to all repos and point this key at it.

Use git config --global to set the value of a key in the global config file and git config --global --list to list the config keys and their current values.

git config --global core.hookspath '<path to hooks directory>'
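
For example (a sketch with a made-up folder name), you could gather your shared hooks into one folder and point Git at it:

# create a shared hooks folder and copy in the pre-commit hook from an existing repo
mkdir -p ~/.git-hooks
cp .git/hooks/pre-commit ~/.git-hooks/pre-commit
# hooks must be executable
chmod +x ~/.git-hooks/pre-commit
# point Git at the shared folder and check that the key has been set
git config --global core.hookspath ~/.git-hooks
git config --global --list

Note that once core.hookspath is set, Git looks for hooks in that folder instead of each repository's .git/hooks directory.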

Spare Your Blushes with Pre-Commit Hooks

It’s Summer (at least in the northern hemisphere), hooray. You’ve booked some time off, wrapped up what you were working on as best you can, committed and pushed all your code, set your out-of-office and switched off Teams. Beautiful.

When you come back you flick through your messages to catch back up. What’s this? Some muppet commented out some vital code and pushed their changes? Who? Why?

It happened. That muppet was me.

There are good reasons why you might remove or add some code in your local environment, but it is really important that those changes don't end up in anyone else's copy.

You can either:

  • Plan A: back yourself never to accidentally commit and push those changes
  • Plan B: add a pre-commit Git hook as an extra line of defense

I’ve played around with Git hooks before but still haven’t actually used them for anything serious. I think I’m going to start now.

Pre-Commit Hook

Open the (hidden) .git/hooks folder inside your repository and rename pre-commit.sample to pre-commit.

As the comments at the top of the file say, if you want to stop the commit then this script should echo some explanatory comment and return non-zero. This is mine:

#!/bin/sh
# block the commit if any staged change contains the DONOTCOMMIT marker
if git diff --staged | grep -qE 'DONOTCOMMIT'; then
    echo "Your staged changes include DONOTCOMMIT"
    exit 1
fi

Before committing, Git looks for a pre-commit file in the hooks folder and executes it if it finds it.

git diff --staged gets a string of the changes which are staged, i.e. going to be included in this commit. This string is piped to grep to match a regular expression – I'm keeping it simple and searching for the string 'DONOTCOMMIT', but you could get fancier if you wanted.

If DONOTCOMMIT is found in the staged changes then a message to that effect is shown and the script exits with 1 (which tells Git not to continue with the commit).

VS Code error dialog thrown by pre-commit hook

Next time I add or remove some code that is for my eyes only I’ll add a //DONOTCOMMIT comment alongside to remind me to undo it again when I push the code.

Execute JavaScript with WebPageViewer for Business Central

TL;DR

The WebPageViewer add-on has an overload to accept some JavaScript. You can use that to execute arbitrary script locally.

WebPageViewer.SetContent(HTML: Text; JavaScript: Text);

JSON Formatting

This post starts with me wanting to format some JSON with line breaks for the user to read. It’s the response from an Azure Function which integrates with a local SQL server (interesting subject, maybe for another time). The result from SQL server is serialized into a (potentially very long) JSON string and this is the string that I want to present in a more human-readable format.

Sometimes I converge on a solution through a series of ideas, each of which is slightly less bad than the previous. This was one of those times. If you don't care about the train of thought then the solution I settled on was to use the JavaScript parameter of the WebPageViewer's SetContent method.

If you’re still here then here are the stations that the train of thought pulled into, starting with the worst.

Requirement

Have some control on my page for the user to view the JSON returned from the Azure Function, formatted with line breaks.

1. Format at Source

Why not just add the line breaks when I am serializing the results in the C# of my Azure Functions? That way I don’t need to change anything in AL.

No, that’s dumb. That would make every response from the function larger than it needs to be just for the rare occasions when a human might want to read it. Don’t do that.

2. Call an Azure Function to Format the Result

I could have a second Azure Function to accept the unformatted result and return the formatted version. I could have a Function App which runs Node.js and returns the result in a couple of lines of code.

Wait, that’s absurd. Call another Azure Function just to execute two lines of JavaScript? And store the Uri for that function somewhere? In a setup table? Hard-coded? In a key vault? Seems somewhat over-engineered.

3. Create a User Control

Hang on. I’m being thick. We can execute whatever JavaScript we want in a user control. I can create a control with a textarea, or just a div, create a function to accept the unformatted JSON, format it and set the content of the div. No need to send the JSON outside of BC.

Closer, and if you want more control over how the JSON looks on screen probably the best bet. But, is it really necessary to create a user control just to execute some JavaScript? Still seems like too much work for what is only a very simple problem.

4. Use WebPageViewer

The WebPageViewer has a SetContent method (which I’ve written about before) which can accept HTML and JavaScript.

If you pass some script it will be executed when the page control is loaded. Perfect for what I need. I can just use the JSON.parse and JSON.stringify functions to read and then re-format my JSON text. I'm also wrapping it in pre tags and removing any single quotes in the text to format (because they will screw up the JavaScript and I can't be bothered to handle them properly).

The AL code ends up looking like this:

local procedure SetResult(NewResult: Text)
var
    JS: Text;
begin
    // strip single quotes - they would break the quoting in the generated JavaScript
    NewResult := NewResult.Replace('''', '');
    // build a script which parses the JSON and re-serialises it with an indent of 2, wrapped in pre tags
    JS := StrSubstNo('document.write(''<pre>'' + JSON.stringify(JSON.parse(''%1''), '''', 2) + ''</pre>'');', NewResult);
    CurrPage.ResultsCtrl.SetContent('', JS);
end;
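
For a concrete (made-up) input of {"id": 1, "name": "Test"}, the JavaScript passed to SetContent ends up as:

document.write('<pre>' + JSON.stringify(JSON.parse('{"id": 1, "name": "Test"}'), '', 2) + '</pre>');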

If you’re not using 26 single quotes in three lines of code then you’re not doing it right 😉