Putting Queries to Use in Business Central

We’ve had query objects for a while now – since NAV 2013 (I think). In theory they sound great: link a bunch of tables together, aggregate some columns, get distinct values, left join, right join, cross join – all executed as a single SQL query.

Why Don’t We Use Queries Much?

In practice I haven’t seen them used that much. There’s probably a variety of reasons for that:

  • We have limited control over the design and execution of the query at runtime. The design is pretty static making it difficult to create useful generic queries short of throwing all the fields from the relevant tables in – which feels heavy-handed
  • I find the way links between dataitems are defined unintuitive
  • It isn’t easy to visualise the dataset that you are creating when writing the AL file
  • Habit – we all learnt FindFirst/FindSet, Repeat and Until when we first started playing with C/AL development and are more comfortable working with loops than datasets. Let’s be honest, the RDLC report writing experience hasn’t done much to convert us to a set-based approach to our data

However, just because there is room for improvement doesn’t mean that we can’t find good uses for queries now. Queries are perfect for:

  • Selecting DISTINCT values in the dataset
  • Aggregates – min, max, sum, average, count – especially for scenarios that aren’t suited to flowfields
  • Date functions – convert date columns to day/month/year – which allows you to easily aggregate another column by different periods
  • Outer joins – it’s possible, but far more expensive, to create this kind of dataset by hand with temporary tables
  • Selecting the top X rows
  • Exposing as OData web services
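
To make a couple of those concrete, here is a sketch of a query object that combines an outer join, a date method and an aggregate. The object ID, name and columns are invented for illustration:

```al
query 50100 "Sales Amount by Cust. Month"
{
    QueryType = Normal;

    elements
    {
        dataitem(Cust; Customer)
        {
            column(CustomerNo; "No.") { }

            dataitem(SalesLine; "Sales Line")
            {
                DataItemLink = "Sell-to Customer No." = Cust."No.";
                SqlJoinType = LeftOuterJoin; // keep customers that have no sales lines

                column(ShipmentMonth; "Shipment Date")
                {
                    Method = Month; // convert the date column to its month number
                }
                column(TotalAmount; Amount)
                {
                    Method = Sum; // aggregated per customer, per month
                }
            }
        }
    }
}
```

Achieving the same grouping and left join by hand would mean nested loops over temporary records – the query hands the whole thing to SQL in one statement.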

It’s the last point in that list that I particularly want to talk about. We’ve been working on a solution lately where Business Central consumes its own OData web services.

What?! What kind of witchcraft is this? Why would you consume a query via a web service when you can call it directly with a query object? Hear me out…

Consuming Queries

I think you’ve got two main options for consuming queries via AL code.

Query Variable

You can define a variable of type query and specify the query that you want to run. This gives you some control over the query before you execute it – you can set filters on the query columns and set the top number of rows. Call Query.Open and Query.Read to execute the query and step through the result set.
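
A minimal sketch of that pattern, assuming a hypothetical query "Top Customer Sales" with CustomerNo and SalesAmount columns:

```al
procedure ShowTopCustomerSales()
var
    TopCustomerSales: Query "Top Customer Sales"; // hypothetical query object
begin
    // Shape the query before executing it
    TopCustomerSales.SetFilter(SalesAmount, '>%1', 1000);
    TopCustomerSales.TopNumberOfRows(10);

    // Execute and step through the result set
    TopCustomerSales.Open();
    while TopCustomerSales.Read() do
        Message('%1: %2', TopCustomerSales.CustomerNo, TopCustomerSales.SalesAmount);
    TopCustomerSales.Close();
end;
```
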

The main downside is that you have to specify the query that you want to use at design-time. That might be fine for some specific requirement but is a problem if you are trying to create something generic.

Query Keyword

Alternatively, we can use the Query keyword and execute a query by its ID. Choose whether you want the results as CSV (presumably popular among the same crowd that is responsible for the increase in cassette tape sales) or XML, and save them either to disk or to a stream.

The benefit is that you can decide on the query that you want to call at runtime. Lovely. Unfortunately you have to sacrifice even the limited control that a query variable gave you in order to do so.
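
A sketch of that approach – the query ID arrives as a parameter and the XML results are written to a stream via the Temp Blob codeunit:

```al
procedure RunQueryToXml(QueryId: Integer)
var
    TempBlob: Codeunit "Temp Blob";
    XmlDoc: XmlDocument;
    InStr: InStream;
    OutStr: OutStream;
begin
    // The query to execute is chosen at runtime...
    TempBlob.CreateOutStream(OutStr);
    Query.SaveAsXml(QueryId, OutStr);

    // ...but no filters or row limits can be applied before it runs
    TempBlob.CreateInStream(InStr);
    XmlDocument.ReadFrom(InStr, XmlDoc);
    // work through the result rows in XmlDoc from here
end;
```
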

OData Queries/Pages

Accessing the query via OData moves us towards having the best of both worlds. Obviously there is significant extra overhead in this approach:

  • Adding the query to the web service table and publishing
  • Acquiring the correct URL to a service tier that is serving OData requests for your query
  • Creating the HTTP request with appropriate authentication
  • Parsing the JSON response to get at the data that you want

This is significantly more work than the other approaches – let’s not pretend otherwise. However, it does give you all the power of OData query parameters to control the query. While I’ve been talking about queries until now, almost all of this applies to pages exposed as OData services as well.

  • $filter: specify column name, operator and filter value that you want to apply to the query, join multiple filters together with and/or operators
  • $select: a comma-separated list of columns to return i.e. only return the columns that you are actually interested in
  • $orderby: specify a column to order the results by – use in combination with $top to get the min/max value of a column in the dataset
  • $top: the number of rows to return
  • $skip: skip this many rows in the dataset before returning – useful if the dataset is too large for the service tier to return in a single call
  • $count: just return the count of the rows in the dataset – if you only want the count there is no need to parse the JSON response and count them yourself
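
Pulling a few of those together, a sketch of calling a published query service from AL. The base URL, company, service name and authorization header are all placeholders for your own environment – and note there is no $orderby here, as dynamic ordering isn’t supported for query objects:

```al
procedure GetTopCustomerSales() Body: Text
var
    Client: HttpClient;
    Response: HttpResponseMessage;
    Url: Text;
begin
    // Placeholder service tier URL, company and service name
    Url := 'https://bc.example.com:7048/BC/ODataV4/Company(''CRONUS'')/TopCustomerSales' +
      '?$filter=Sales_Amount gt 1000&$top=10&$select=Customer_No,Sales_Amount';

    // e.g. basic authentication built from a user name + web service access key
    Client.DefaultRequestHeaders.Add('Authorization', 'Basic <base64 credentials>');

    if Client.Get(Url, Response) and Response.IsSuccessStatusCode then
        Response.Content.ReadAs(Body); // JSON: the rows sit in the "value" array
end;
```

From here it is standard JsonObject/JsonArray work to get at the rows you asked for.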

5 thoughts on “Putting Queries to Use in Business Central”

  1. Hi James!

    I tried your Query-OData approach because I needed a dynamic OrderBy. Turns out “Client requests for Query objects using OrderBy are not supported by OData web services”, so I had to use an API Page. It worked like a charm, but we had to discard this approach because we need authorization to perform the HttpRequest, and the only solution we found was hardcoding the token… Not useful for a real life scenario. Any suggestion regarding that?

    Greetings and thanks for the post!

    PS: $orderBy won’t work, it’s $orderby. It took me a while to find out because OData doesn’t complain, it just doesn’t order them 🙂


    1. Hi,

      Glad that you tried it out 👍 Yes, unfortunately dynamic orderBy is only supported on page web services, not queries.

      Regarding authentication, we have a setup page which holds a user name and web service access key that are used to call the services. It’s not perfect, but as we schedule the calls to run off the job queue we have to save the credentials somewhere to use them later.


      1. Hi James, thanks for the quick reply!

        I see, we had considered that setup page solution, but we were concerned about the security issues of storing that kind of information in such an exposed table.

        Sorry for the off-topic, but this kind of username-webkey access to the API wouldn’t be enough if we wanted to open these APIs to an external service, right? According to Microsoft, we should use Azure Active Directory authentication (https://docs.microsoft.com/en-us/dynamics-nav/api-reference/v1.0/endpoints-apis-for-dynamics ).

        Thank you!


      2. Hi,

        Any sensitive data like this can be stored in isolated storage with encryption enabled. See the built-in IsolatedStorage functions in AL.

        No, web service access isn’t sufficient for API access but it is (still currently) permitted for authentication direct to a tenant web service (which is what we use it for).

