BC2ADLS: MS Fabric enabled part 2

In part two we look at the Fabric side of the integration between Business Central and the bc2adls extension.

You can read part one here:
BC2ADLS: MS Fabric enabled part 1 – Discover Microsoft Business Central (bertverbeek.nl)

With the export to Fabric, all the delta files are in place and must be consolidated into a new or existing table.

That is done through a Python notebook script in three steps. But first, the settings:

In this case we have to set some parameters.
The first two parameters are the standard folders, which don’t have to change.
For the workspace parameter you can fill in your workspace name or its GUID; the same goes for the lakehouse parameter.

If you set Remove_delta to True, all your delta files will be removed after processing. In a production environment, always set this to True; otherwise the step of removing duplicates will take a long time.
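
For reference, a minimal sketch of what such a settings cell could look like. The parameter names follow the descriptions above, and the reset folder path is an assumption, so check the actual notebook for the exact names:

# Illustrative settings cell; names follow the blog's description and may
# differ slightly from the actual notebook.
folder_path = "/lakehouse/default/Files/deltas/"              # standard delta folder
folder_path_reset = "/lakehouse/default/Files/deltas_reset/"  # assumed reset folder
workspace = "<workspace name or GUID>"
lakehouse = "<lakehouse name or GUID>"
Remove_delta = True  # always True in production; otherwise deduplication gets slow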

 

From CSV files to DataFrame

First we pick up all the CSV files from the directory and load them into a DataFrame:
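
As a sketch (the exact read options in the notebook may differ), loading every exported CSV into one Spark DataFrame could look like this:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read all exported delta CSV files into a single DataFrame.
# Note: Spark in Fabric addresses lakehouse files with the relative
# "Files/..." path, while the local file API uses "/lakehouse/default/Files/...".
df_new = (
    spark.read
    .option("header", "true")
    .csv("Files/deltas/*.csv")
)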

Once the data is loaded, we change all the column types based on the manifest file we exported from Business Central, so that the column types match those in Business Central:
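
A hedged sketch of that cast; here we assume the manifest has already been parsed into a simple column-to-type dictionary (the real notebook derives this from the manifest that bc2adls exports, and the column names below are only examples):

from pyspark.sql.functions import col

# Illustrative mapping; in practice these pairs come from the exported manifest.
column_types = {
    "systemId": "string",
    "SystemCreatedAt": "timestamp",
    "SystemModifiedAt": "timestamp",
    "Amount": "decimal(38, 18)",
}

# Cast each column to the type Business Central declared for it.
for name, spark_type in column_types.items():
    df_new = df_new.withColumn(name, col(name).cast(spark_type))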

Delete the deleted records

When a record is deleted in Business Central, it gets an entry in the table “ADLSE Deleted Record”. Those records are also exported, but with a blank “SystemCreatedAt” field. That lets us filter them into a separate DataFrame and join it against our current DataFrame:

A ‘left anti join’ returns only the rows from the left DataFrame (df_new in this case) that do not have a matching key in the right DataFrame.
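
A minimal sketch of both steps, assuming a blank SystemCreatedAt arrives as null or an empty string after the CSV read:

from pyspark.sql.functions import col

# Deletion markers from "ADLSE Deleted Record" come through without a SystemCreatedAt.
df_deleted = df_new.filter(
    col("SystemCreatedAt").isNull() | (col("SystemCreatedAt") == "")
)

# Left anti join: keep only the rows whose systemId does NOT appear among the deletions.
df_new = df_new.join(df_deleted.select("systemId"), on="systemId", how="left_anti")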

Overwrite the changed records

The last action (before loading the data into a lakehouse table) is to overwrite the records that were changed in Business Central.
For that we sort the dataset on systemId and SystemModifiedAt, and then drop the duplicates.
The dropDuplicates function keeps the first record and drops all the others:
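
A minimal sketch of that step:

from pyspark.sql.functions import col

# Sort so the newest modification of each record comes first, then keep
# one row per systemId; dropDuplicates keeps the first row it encounters.
df_new = (
    df_new
    .orderBy(col("systemId"), col("SystemModifiedAt").desc())
    .dropDuplicates(["systemId"])
)

Note that on a distributed cluster, orderBy followed by dropDuplicates does not strictly guarantee which row survives; a row_number window over systemId, ordered by SystemModifiedAt descending, is the watertight alternative.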

Now we are done and can overwrite the data in the existing table with the DataFrame.
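
As a sketch (the table name is illustrative, and the real notebook may merge differently), the final write could look like this:

# "GLEntry" is only an example target table name.
table_name = "GLEntry"

if spark.catalog.tableExists(table_name):
    # Combine with the rows already in the table, then keep the newest
    # version of every record, exactly as in the deduplication step above.
    df_existing = spark.read.table(table_name)
    df_new = (
        df_existing.unionByName(df_new)
        .orderBy(col("systemId"), col("SystemModifiedAt").desc())
        .dropDuplicates(["systemId"])
    )

# Write the result back as a delta table so Direct Lake can use it.
df_new.write.mode("overwrite").format("delta").saveAsTable(table_name)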

Schedule the notebook

You can also schedule the notebook itself. That way you don’t have to create a pipeline and schedule it:

You can select the Repeat option. The start and end day are optional:
 

Access your data in Power BI

In Power BI Desktop you can choose Get Data, and there you will also see Microsoft Fabric:

After that you can select your lakehouse:

And your Business Central data is in Power BI:

Because the data is already in OneLake, in a lakehouse with delta tables, Power BI can use Direct Lake. That means it doesn’t have to load all your data into a dataset first (as it would with data from an Azure Data Lake) before presenting it in Power BI:
Direct Lake feature diagram.

Learn about Direct Lake in Power BI and Microsoft Fabric – Power BI | Microsoft Learn

All the code of the notebook can be found in this branch:
bc2adls/fabric/CopyBusinessCentral.ipynb at Microsoft-Fabric-Integration · Bertverbeek4PS/bc2adls (github.com)

COMMENTS

Martí

Hi, thanks for sharing this amazing tool. I have tried to follow all the steps but I am facing an error when trying to execute the “CopyBusinessCentral” notebook. The error is clear: “No such file or directory: ‘/lakehouse/default/Files/deltas/’”, as it should be the default. This directory is defined in the settings as “folder_path” or “folder_path_reset”.

    Bert Verbeek

    Hi Marti,
    Have you entered the correct lakehouse and workspace in the script? And is your lakehouse the default lakehouse attached to your notebook script?

Martí

Hi Bert, thank you for your quick reply. This “No such file or directory” is solved.
Everything works fine, and I have a new CSV file in the “Deltas” directory every time I export a table. But although it works fine, I get an error in the list part “ADLSE Setup Tables” (in Business Central): Invalid GUID string format, with the following error details:

“ADLSE Communication”(CodeUnit 80003).GetBaseUrl line 15 – BC2ADLS by Castañer
“ADLSE Communication”(CodeUnit 80003).FlushPayload line 35 – BC2ADLS by Castañer
“ADLSE Communication”(CodeUnit 80003).Finish line 2 – BC2ADLS by Castañer
“ADLSE Communication”(CodeUnit 80003).TryFinish line 3 – BC2ADLS by Castañer
“ADLSE Execute”(CodeUnit 80005).ExportTableUpdates line 52 – BC2ADLS by Castañer
“ADLSE Execute”(CodeUnit 80005).TryExportTableData line 14 – BC2ADLS by Castañer
“ADLSE Execute”(CodeUnit 80005).OnRun(Trigger) line 48 – BC2ADLS by Castañer

    Bert Verbeek

    Hi Marti,
    What you can do is put the GUIDs of the workspace and the lakehouse in Business Central.
    You can find the GUIDs in the URL of Fabric. That works much better for the APIs.

Marti

Thanks Bert. Solved!!
