BC2ADLS: MS Fabric enabled part 1

In my previous blog post:
Let’s flow your Business Central data into Microsoft Fabric – Discover Microsoft Business Central (bertverbeek.nl)
I showed you how you can use a Dataflow to flow your Business Central data into Microsoft Fabric.
However, every time you refresh the dataflow it loads all the table data again into your Microsoft Fabric OneLake; it doesn’t use the deltas.

But there is also an API for OneLake that you can use to upload your data:
OneLake access and APIs – Microsoft Fabric | Microsoft Learn
This API doesn’t differ much from the Data Lake Storage Gen2 API:
Azure Data Lake Storage Gen2 REST API reference – Azure Storage | Microsoft Learn

And we already have the tool bc2adls:
microsoft/bc2adls: Exporting data from Dynamics 365 Business Central to Azure data lake storage (github.com)
Since this extension exports Business Central data into an Azure Data Lake, we can adapt it to export into the OneLake of Microsoft Fabric!

In part 1 we cover the Business Central side. In the next part we will cover the MS Fabric side.

NB: this is still a proof of concept and has not been pushed into the main branch of the bc2adls repo.
You can find the source code here:
Bertverbeek4PS/bc2adls at Microsoft-Fabric-Integration (github.com)

We are importing the data as files into a lakehouse inside Microsoft Fabric.
A lakehouse is for unstructured data and can consist of tables and files.
A warehouse is for structured data and has only tables.
Business Central data is structured, but since we are uploading files we import it into a lakehouse. From there we create tables in the lakehouse:

For more on the differences between a lakehouse and a warehouse, have a look here:
Fabric decision guide – lakehouse or data warehouse – Microsoft Fabric | Microsoft Learn

Authentication

You can authenticate OneLake APIs using Microsoft Azure Active Directory (Azure AD) by passing through an authorization header. If a tool supports logging into your Azure account to enable AAD passthrough, you can select any subscription – OneLake only requires your AAD token and doesn’t care about your Azure subscription.

So in this case we have to create an app registration in the Azure portal with the following API Permissions:

These permissions must be granted by a global admin, since we authenticate with a username and password.
We also use https://storage.azure.com/.default as the scope and https://storage.azure.com/ as the resource.
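
Below is a minimal sketch, in Python, of how requesting such a token could look with the username/password (ROPC) flow against the scope above. The tenant id, client id and credentials are placeholders for your own app registration; this assumes a public client app and is not the exact code used in the extension.

import requests

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-registration-client-id>"
USERNAME = "<user@yourtenant.com>"
PASSWORD = "<password>"

# Request a token from Azure AD with the resource owner password credentials flow.
token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
response = requests.post(
    token_url,
    data={
        "grant_type": "password",
        "client_id": CLIENT_ID,
        "scope": "https://storage.azure.com/.default",
        "username": USERNAME,
        "password": PASSWORD,
    },
)
response.raise_for_status()
access_token = response.json()["access_token"]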

Creation of the delta file

Once we have authenticated, we have to create a file.
The URL of OneLake is as follows:
https://onelake.dfs.fabric.microsoft.com/<workspace>/<item>.<itemtype>/<path>/<fileName>

In my example my workspace is called FabricTest and my lakehouse is called BusinessCentral.
So the base URL would be:
https://onelake.dfs.fabric.microsoft.com/FabricTest/BusinessCentral.Lakehouse/Files
You can also look it up in your lakehouse:

And when we want to create a file, we call its URL (the base URL plus the file path and name) with ‘?resource=file’.

When we have done that, we can create files in OneLake.
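
As a rough illustration, and continuing from the token sketch above, creating a file could look like this. The workspace and lakehouse names match the example, but the file path under Files is just a made-up example; bc2adls decides the real path and name.

import requests

WORKSPACE = "FabricTest"
LAKEHOUSE = "BusinessCentral"

# Base URL of the Files section of the lakehouse, as shown above.
base_url = f"https://onelake.dfs.fabric.microsoft.com/{WORKSPACE}/{LAKEHOUSE}.Lakehouse/Files"

# Hypothetical file name for a delta export.
file_url = f"{base_url}/deltas/Customer-18.csv"

# PUT <file url>?resource=file creates an empty file at that path.
create = requests.put(
    f"{file_url}?resource=file",
    headers={"Authorization": f"Bearer {access_token}"},
)
create.raise_for_status()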

Adding data to the file

After the file is created we have to put the deltas, in CSV format, into the created file.
For this we call the file URL with ‘?position=0&action=append&flush=true’.

In this way we append the data to the created file at position 0 and flush it in the same call.
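
Continuing the sketch above, the append call could look like this; the CSV payload is a made-up example of what an exported delta might contain.

# Hypothetical CSV content; in practice this is the delta exported by bc2adls.
csv_payload = "No_,Name\n10000,Contoso\n"

append = requests.patch(
    f"{file_url}?position=0&action=append&flush=true",
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "text/csv",
    },
    data=csv_payload.encode("utf-8"),
)
append.raise_for_status()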

Implementing in the bc2adls extension

To implement it in bc2adls, we first have to create a setup:

We have to fill in the Workspace, Lakehouse, Username and Password.
After that we can select the tables and fields as usual, and when we export, the files end up in the lakehouse inside MS Fabric in the same structure as when exporting to the Azure Data Lake:

After this we have to consolidate the deltas into a table and remove the deleted records. This will be done in part 2.

10 COMMENTS

Martin Goedhart

Hi Bert,

Nice coincidence, I was just redirected to bc2adls in my search for proper data extraction from BC and my first thought was: can it be done with Fabric instead of ADLS. So first of all thank you for your efforts in adjusting bc2adls. I’ve started trying out your steps detailed here, but I’m afraid I lack some prior knowledge of bc2adls. I do not understand the step ‘Creation of the delta file’. Where do you use these URLs? I have created a lakehouse in a workspace in Fabric, but when I simply put the corresponding URLs in the browser I get an authentication error. It seems I’m missing a step where the URLs are supposed to be used.

I’ve set up the APP Id with client secret, I’ve installed your fabric version of the bc2adls extension in BC and I created a lakehouse. But missing one piece of the puzzle to complete the export it seems. Could you help me out?

    Bert Verbeek

    I have sent you an email about it.
    The branch had one small bug and has been updated.

Jesper Theil

So the Data Fabric primarily replaces the Azure Data Lake Gen2 / Storage Account here?
I assume you will then reimplement pipelines similar to what we have today in bc2adls to consolidate deltas etc?

Since the pipelines are the heaviest in terms of cost/compute – will the Data Fabric do anything to change the cost profile of the bc2adls solution you think?

    Bert Verbeek

    Well, you can choose the export option: Data Lake or Fabric. That is up to you.
    In Fabric there will be a notebook script to consolidate the data. I have the script ready, but I haven’t tested it with a large dataset yet.
    It looks like the notebook script can be very simple and quick to run, so the costs should be lower.

Sam Saeidpour

Hi Bert,
Thanks for sharing your knowledge. Actually I have the same problem as Martin Goedhart. Would you please send me that email too?

Also in Authentication section, you said: “Also we use https://storage.azure.com/.default as scope and https://storage.azure.com/ as resource.” Where should I put those scope and resource?

    Bert Verbeek

    Hi Sam,
    I have sent you an email, because the mail from Martin is in Dutch. There was also a bug in the app.
    /Bert
