Analyse your Business Central Telemetry in Microsoft Fabric

Sometimes you want to analyze your data in Power BI.
Microsoft already provides a great Power BI report that you can download. You can find the details here:
Analyze and monitor telemetry with Power BI – Business Central | Microsoft Learn

But sometimes you want to build your own analyses of your Business Central telemetry. And now it is also possible to get your telemetry data into Microsoft Fabric!

The structure will be like this:

First you need to set up an Event Hub. Then, under the diagnostic settings of your Application Insights resource, you can choose which categories should flow into your Event Hub.
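As a rough sketch, the same diagnostic setting can also be created with the Azure CLI. All resource IDs and names below are placeholders, and the category list (here only `AppTraces`) depends on what you want to export:

```
# Hypothetical names/IDs - replace with your own resources.
az monitor diagnostic-settings create \
  --name "bc-telemetry-to-eventhub" \
  --resource "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/microsoft.insights/components/<app-insights-name>" \
  --event-hub "<event-hub-name>" \
  --event-hub-rule "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.EventHub/namespaces/<namespace>/authorizationRules/RootManageSharedAccessKey" \
  --logs '[{"category": "AppTraces", "enabled": true}]'
```

Doing it through the portal's Diagnostic settings blade, as described above, gives the same result.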

When that is done, you can create an eventstream inside Microsoft Fabric. With an eventstream you can stream real-time data from a source, in this case an Event Hub, to a KQL database or a Lakehouse.

Be aware that you receive a JSON document that combines multiple records in a single payload. You need to expand it before you ingest it into your destination:
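To illustrate what "expanding" means here: the diagnostic export wraps the individual telemetry rows in a single JSON object under a `records` array. The payload below is a simplified, hypothetical example of that shape, not the full schema:

```python
import json

# Simplified illustration of a diagnostic-export payload: one JSON object
# whose "records" property holds multiple telemetry rows.
payload = '{"records": [{"message": "AL0001", "severityLevel": 1}, {"message": "AL0002", "severityLevel": 2}]}'

def expand_records(raw: str) -> list:
    """Split the combined JSON document into one dict per telemetry record."""
    return json.loads(raw)["records"]

rows = expand_records(payload)
print(len(rows))           # 2
print(rows[0]["message"])  # AL0001
```

In the eventstream itself you do this with the built-in event-processing operations (expand the array) rather than with your own code, but the transformation is the same.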

When that is done, you can see in the diagram that everything is working, and you can inspect the entries in your Lakehouse or KQL database.

KQL Database

Inside the KQL database you can very easily use KQL queries to analyze the records:
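For example, a query along these lines counts telemetry events per event ID. The table name `traces` and the dynamic `Properties` column are assumptions based on the ingested structure shown later; adjust them to your own schema:

```
traces
| extend eventId = tostring(Properties.eventId)
| summarize count() by eventId
| order by count_ desc
```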

You can also create a materialized view. In this case you can create several views on all the tables:
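As a sketch, a materialized view that pre-aggregates the record count per day could look like this (the table name `traces` and the timestamp column are assumptions; materialized views only support a limited set of aggregations, such as `count()`):

```
.create materialized-view TracesPerDay on table traces
{
    traces
    | summarize count() by bin(Timestamp, 1d)
}
```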


When you load it into a Lakehouse, you can see that the records (dimensions) are stored in a JSON structure:

If you want to analyze the results, you can move them into another structure (a Warehouse or a Lakehouse) with the following Python code:

from pyspark.sql.functions import col

# Read the raw records from the Lakehouse table
df = spark.sql("SELECT ArrayValue_records FROM BusinessCentral.traces")

# Flatten the nested struct: promote all fields of ArrayValue_records,
# expand the nested Properties struct, then drop the original Properties column
df = df.select(col("ArrayValue_records.*"), col("ArrayValue_records.Properties.*")).drop("Properties")
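From there, persisting the flattened DataFrame as its own table (to be run in the same Fabric notebook; the table name `traces_expanded` is a hypothetical choice) could look like:

```
# Write the flattened records to a new table in the Lakehouse
# ("traces_expanded" is a hypothetical name - pick your own).
df.write.mode("overwrite").saveAsTable("BusinessCentral.traces_expanded")
```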


And then, of course, you can create your reports in Power BI!
