Two ways of exporting BC telemetry

It is a good thing to have Application Insights enabled for your Business Central extension or NST. But sometimes you get a lot of data in your Application Insights. At 4PS we ingest about 25 GB of data per day, and that will only grow in the future, which costs a lot of money. At the same time, you do want to store some events for analysis.

You could set your retention period higher than the standard 30 days, but that gets expensive. There are other ways to store your important telemetry at a lower price.

1. Diagnostic settings

The first way of exporting is to turn on diagnostic settings on your Application Insights resource. Here you can export your telemetry to a storage account:

Put a checkmark next to “AppTraces” and “AppPageViews”, and optionally fill in “Retention (days)”. Zero is the default, which means the data is never deleted from your storage account. If you enter 30, the data is deleted from the storage account after 30 days.

Application Insights will then export a JSON file to your storage account every hour.
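If you prefer to script this instead of clicking through the portal, the diagnostic setting is a regular Azure resource. A minimal sketch of its properties could look like this (the resource IDs are placeholders; check the exact category names against what your Application Insights resource offers):

{
  "properties": {
    "storageAccountId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
    "logs": [
      {
        "category": "AppTraces",
        "enabled": true,
        "retentionPolicy": { "enabled": true, "days": 30 }
      },
      {
        "category": "AppPageViews",
        "enabled": true,
        "retentionPolicy": { "enabled": true, "days": 30 }
      }
    ]
  }
}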

2. Azure Data Factory

The other way to export your data is through “Azure Data Factory”.

In Azure Data Factory you have to create a pipeline with a “Copy data” activity in it:

On the source side, create a connection to the REST API of Application Insights and set the “Request Method” to POST:
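For reference, the REST connection points at the Application Insights query API. A rough sketch of the linked service, assuming API key authentication where the key is passed later as an x-api-key header on the source (the app id placeholder is mine):

{
  "name": "AppInsightsRest",
  "properties": {
    "type": "RestService",
    "typeProperties": {
      "url": "https://api.applicationinsights.io/v1/apps/<app-id>/query",
      "enableServerCertificateValidation": true,
      "authenticationType": "Anonymous"
    }
  }
}

On the copy activity source itself you then set the “Request Method” to POST and can add “x-api-key: <your-api-key>” under the additional headers.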

For the “Request Body” I used this KQL query. Make sure it ends with a project statement, because you want to flatten the output:

{
  "query": "let date_to_query = startofday(datetime('@{formatDateTime(variables('windowStart'),'yyyy-MM-dd')}'), -1);traces | where startofday(timestamp) == date_to_query | where customDimensions.eventId == 'RT0005' or customDimensions.eventId == 'RT0006' or customDimensions.eventId == 'RT0011' or customDimensions.eventId == 'RT0012' or customDimensions.eventId == 'RT0013'| project   eventId = tostring(customDimensions.eventId),   aadTenantId = customDimensions.aadTenantId,alObjectId = customDimensions.alObjectId,alObjectName = customDimensions.alObjectName,alObjectType = customDimensions.alObjectType,clientType = customDimensions.clientType,companyName = customDimensions.companyName,component = customDimensions.component,componentVersion = customDimensions.componentVersion,environmentType = customDimensions.environmentType,executionTimeInMs = customDimensions.executionTimeInMs,extensionId = customDimensions.extensionId,extensionName = customDimensions.extensionName,extensionPublisher = customDimensions.extensionPublisher,extensionVersion = customDimensions.extensionVersion,longRunningThresholdInMs = customDimensions.longRunningThresholdInMs,sqlStatement = tostring(customDimensions.sqlStatement),numberOfDocuments = customDimensions.numberOfDocuments,numberOfRows = customDimensions.numberOfRows,result = customDimensions.result,sqlExecutes = customDimensions.sqlExecutes,sqlRowsRead = customDimensions.sqlRowsRead,operation_Name = operation_Name,client_City = client_City,client_StateOrProvince = client_StateOrProvince,client_CountryOrRegion = client_CountryOrRegion"
}

Once you have done that, you get a JSON file like this in the preview window:
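The screenshot is not reproduced here, but the query API returns its result as one table with a columns array and a rows array, roughly in this shape (the values are shortened and illustrative):

{
  "tables": [
    {
      "name": "PrimaryResult",
      "columns": [
        { "name": "eventId", "type": "string" },
        { "name": "aadTenantId", "type": "dynamic" },
        { "name": "alObjectId", "type": "dynamic" }
      ],
      "rows": [
        [ "RT0005", "<tenant-id>", "50100" ]
      ]
    }
  ]
}

This is why the rows live under tables[0].rows, which is what the collection reference below points at.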

On the Sink tab you create a connection to your “Azure Data Lake”:

And now comes the hardest part: you must map the JSON fields to a CSV file format.

In the field “Collection reference” just put $['tables'][0]['rows'], so the copy activity iterates over the rows array of the response.

Then enter the following for each column mapping:
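Because each row in the response is a JSON array rather than an object, the columns are addressed by position. In the JSON code of the copy activity this ends up as a TabularTranslator; a rough sketch, where the exact source paths are best copied from what the mapping tab generates for you:

"translator": {
  "type": "TabularTranslator",
  "collectionReference": "$['tables'][0]['rows']",
  "mappings": [
    { "source": { "path": "$[0]" }, "sink": { "name": "eventId" } },
    { "source": { "path": "$[1]" }, "sink": { "name": "aadTenantId" } },
    { "source": { "path": "$[2]" }, "sink": { "name": "alObjectId" } }
  ]
}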

And the last part is to create a variable with the name windowStart:
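In the pipeline JSON this is just a string variable, and you can fill it at the start of each run, for example from the trigger's scheduled time. A sketch, assuming a schedule trigger (the activity name is made up):

"variables": {
  "windowStart": {
    "type": "String",
    "defaultValue": ""
  }
}

{
  "name": "Set windowStart",
  "type": "SetVariable",
  "typeProperties": {
    "variableName": "windowStart",
    "value": "@{trigger().scheduledTime}"
  }
}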

Then you can set the trigger to run every night, and your Application Insights data will be exported to a CSV file:
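A nightly schedule trigger is plain Data Factory JSON as well; a sketch, where the pipeline name and start time are just examples:

{
  "name": "NightlyTelemetryExport",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2022-01-01T02:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "ExportBCTelemetry",
          "type": "PipelineReference"
        }
      }
    ]
  }
}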

You can open this file in Power BI using the “Azure Data Lake Storage Gen2” connector. For how to deal with the file format, you can look here:

Analyze data in Azure Data Lake Storage Gen2 by using Power BI – Power Query | Microsoft Docs

Or you can set up a dataflow.

There is also a small readme on BCTech: BCTech/samples/AppInsights/Datalake at master · microsoft/BCTech (github.com)
