Power Apps - Microsoft Dataverse
Unanswered

Finance and Operations tables in Dataverse

Posted on 13 Feb 2024 07:23:08 by 54
Hi,
 
We have recently purchased Microsoft Fabric, which we'll be using as the data platform throughout our organization.
Previously, we integrated our Finance and Operations data via the BYOD concept into an Azure SQL database that we used as a data warehouse. Now we would like to set up and integrate our F&O data with Microsoft Fabric.

According to Microsoft documentation, this should be possible to do through Power platform.
Currently, our Finance and Operations environment is linked to a Power platform environment through Lifecycle Services, which has Dataverse enabled.
However, we're not able to see our Finance and Operations tables in make.powerapps.com under the Tables menu.
Our goal is to do this without enabling Dual-write, since there's no need for dual-write in our case.
 
We've looked through the official Microsoft documentation for this but are not able to find the issue.
 
The configuration in Lifecycle Services follows Microsoft's documentation for linking Finance and Operations with Power Platform.
My security role in the environment is Dataverse System Administrator, and I'm able to see other tables which are not Finance and Ops.

Any help or guidance on why we cannot see the Finance and Operations tables in our Dataverse would be much appreciated!
  • HMSNemanja Profile Picture
    54 on 30 May 2024 at 09:33:20
    Re: Finance and Operations tables in Dataverse

    Hi @sohaiby 

     

    Glad to hear that it's of some help.

     

    Regarding your questions:

    1. You mentioned removing the Dataverse tables from the Fabric capacity, but I'm not able to deselect the Dataverse tables from the Manage Tables screen. I can only select/deselect the F&O tables. Is there a way to do that?

    This is by design.

    While the "Link to Microsoft Fabric" functionality was in public preview, it was in fact possible both to manually select which tables to include and to manually deselect tables.

    However, this was changed for GA, meaning that it's no longer possible to manually select or deselect tables once the link is established.

    HMSNemanja_0-1717058770701.png

     


    That being said, there is a way to deselect tables from Dataverse, but it relies on a bug which I described on page 2 of this thread (Option 3):

    HMSNemanja_1-1717058840159.png

    But I do not recommend using this, and I have personally reported this to Microsoft during Q1 of this year.
    If you want to try this anyways, I strongly recommend that you read through my post on page 2 and understand the consequences beforehand. 

     

    Regarding your second question:

    I have little to no experience in using Finance and Operations entities. This has been a deliberate choice from my side since support for adding entities in the Link to Microsoft Fabric was not in place when I created this for our company. Also, I wanted to go directly on the Finance and Operations tables to avoid having to maintain version changes in the entities.

    However, there is official documentation for it:

    https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-select-fno-data#enable-finance-and-operations-data-entities-in-azure-synapse-link

     

    Also, make sure to check the known limitations:
    https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-select-fno-data#known-limitations-with-finance-and-operations-tables

     

    Sorry that I could not offer more help here.

    But let me know if you are not able to move forward and I'll do some investigating in our environment. 

  • sohaiby Profile Picture
    9 on 30 May 2024 at 07:45:34
    Re: Finance and Operations tables in Dataverse

    Hi @HMSNemanja ,

    Appreciate your detailed response. That clears a lot of things.

    I do have a couple of questions here.

    1. You mentioned removing the Dataverse tables from the Fabric capacity, but I'm not able to deselect the Dataverse tables from the Manage Tables screen. I can only select/deselect the F&O tables. Is there a way to do that?

    manage tables.png

    2. We have some data entities in D365 that we are exporting to BYOD for Power BI reporting, and we want to use the same in our Fabric capacity as well. I have made those entities visible as Virtual entities in D365 CRM settings (as per my understanding this is what was required), but I'm not able to see them in the Manage Tables screen. Any idea what could be the issue? (Change tracking is enabled in those entities).

     

    Many thanks.

  • HMSNemanja Profile Picture
    54 on 29 May 2024 at 17:27:29
    Re: Finance and Operations tables in Dataverse

    Hi @sohaiby 

     

    Glad to hear that you've found some of the information helpful.

    Since we've been through this process and are now pretty far into it, I understand some of the confusion and frustration this can create, particularly since the official Microsoft documentation is quite unclear.

     

    I've come to realize that the main reason for the confusion is the mixed terminology used in the documentation.

    Since the documentation is written by different teams at Microsoft, and since Power BI is being merged into Microsoft Fabric, a lot of terms and technologies are being mixed together.

     

    That being said, your consultant is correct: the traditional "Azure Synapse Link" and the "Link to Microsoft Fabric" are two independent solutions. However, they reuse a lot of the same back-end architecture and components in Azure.

    This means that with the traditional Azure Synapse Link, you had two options:

     

    1. You would need a Gen2 ADLS account to act as a Datalake (incremental export to CSV).
    2. Alternatively you would need an Azure Synapse Analytics Workspace with a dedicated spark pool to convert the exports to Delta Parquet.

    But with the new "Link to Fabric" version, option 2 still happens, just in the back-end. Your OneLake is in fact the Gen2 ADLS account, and the Fabric capacity creates the export to Delta Parquet for you. You don't have to define the Spark pool or Spark job for the conversion; it happens automatically behind the scenes.

    It does however in fact create an "Azure Synapse Link" which is visible in your Dataverse environment (make.powerapps.com).

    The link is always automatically named "Microsoft OneLake" and it also automatically creates a Lakehouse for you in the designated Microsoft Fabric Workspace (the workspace you've chosen in the configuration Wizard).

    So basically, what you're seeing is according to design from Microsoft.

     

    This means that with Microsoft Fabric, you do not need an Azure Synapse Analytics workspace at all. Your Fabric capacity acts as the "entire solution" and should be your only cost.

     

     

    Keep in mind that you may only have one single "Link to Microsoft Fabric" Synapse link active in any given environment at a time, and that it automatically includes all non-system tables from your Dataverse that have the "Track Changes" property enabled.

    Currently, there seems to be a limit on the number of tables that can be included in the "Link to Microsoft Fabric" synapse link, the limit is set to 1,000 tables.
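    Since the link sweeps in every change-tracked table automatically, it can be useful to audit beforehand which tables will be included. A minimal sketch of that check against the Dataverse Web API metadata endpoint follows; the environment URL and bearer token are placeholders you must supply yourself:

    ```python
    # Sketch: list Dataverse tables that have "Track Changes" enabled, i.e. the
    # tables the "Link to Fabric" will include. The org URL below is a placeholder.

    def change_tracked_tables_url(env_url: str) -> str:
        """Build the EntityDefinitions metadata query filtering on ChangeTrackingEnabled."""
        return (
            f"{env_url.rstrip('/')}/api/data/v9.2/EntityDefinitions"
            "?$select=LogicalName,ChangeTrackingEnabled"
            "&$filter=ChangeTrackingEnabled%20eq%20true"
        )

    def list_change_tracked_tables(env_url: str, token: str) -> list[str]:
        """Call the Web API and return logical names of change-tracked tables."""
        import requests  # imported lazily so the URL helper works without it

        resp = requests.get(
            change_tracked_tables_url(env_url),
            headers={
                "Authorization": f"Bearer {token}",
                "OData-MaxVersion": "4.0",
                "OData-Version": "4.0",
            },
            timeout=30,
        )
        resp.raise_for_status()
        return [row["LogicalName"] for row in resp.json()["value"]]

    # Example (placeholder environment URL):
    url = change_tracked_tables_url("https://yourorg.crm4.dynamics.com")
    print(url)
    ```

    Counting the results against the 1,000-table limit mentioned above before establishing the link can save a surprise later.
    
    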


    Hope this clarifies some of your concerns!

    Let me know if anything else is needed and good luck with your project! 

  • sohaiby Profile Picture
    9 on 29 May 2024 at 15:02:59
    Re: Finance and Operations tables in Dataverse

    Hi @HMSNemanja ,

    Just wanted to say a big thanks to you for answering all these questions and keeping us updated on this.

    We are facing exactly the same issue: we are currently using BYOD to export F&O data for our Power BI reports, and we are looking for a new, optimized solution because the BYOD process has become painfully slow and almost useless due to our increased data size.

    I have followed the steps of Solution #1 suggested by you, as this seems ideal for us. The only confusion I faced is with using the Azure Synapse Link to make the F&O tables available in the Fabric Lakehouse. As per Microsoft's documentation and our support consultant, Link to Fabric and Azure Synapse Link are completely independent solutions and can each be used to export the data to the Lakehouse, but it looks like we still need to use the Azure Synapse Link for the Dataverse connection even if we go for the Microsoft Fabric solution. I'm hoping this won't incur any cost for Azure Synapse Analytics usage, as we believe that paying for the Microsoft Fabric capacity should be the only cost involved in this solution.

    Again, massive thanks for providing all the updates across all the Dynamics/Power platform forums

  • HMSNemanja Profile Picture
    54 on 03 May 2024 at 07:55:58
    Re: Finance and Operations tables in Dataverse

    Hi SpaceAnalytics,

     

    No worries, you're welcome!

     

    Since Microsoft's official documentation isn't very descriptive when it comes to possible errors in the setup, could you please verify whether the Synapse link actually manages to create the Lakehouse?

    In other words, when you do the "Link to Fabric" configuration, does it create an empty Lakehouse or nothing at all?

     

    Do yourself a favour and double-check the account's permissions (these prerequisites aren't very clear). In short, the account which creates the "Link to Fabric" Synapse link needs:

    • Power Platform administrator
    • System administrator in the Dataverse environment you're integrating against
    • Contributor RBAC role on the Fabric resource in Azure (assuming you provisioned it through Azure)
    • Fabric Administrator role in your Entra ID
    • Workspace Admin role on the Fabric capacity workspace
       

    Also, double check another Power BI tenant setting:

    Tenant settings => Microsoft Fabric => "Users can create Fabric items" should be enabled, and make sure that the account you're using to set up the integration is included in one of the groups.

     

     

  • SpaceAnalytics Profile Picture
    16 on 03 May 2024 at 07:08:28
    Re: Finance and Operations tables in Dataverse

    Hello HMSNemanja,

     

    For some reason I completely missed your reply of March 27th; I apologize, because I greatly appreciate the time and effort you've put into this thread. I also have an ongoing ticket with Microsoft, but without effect so far.

     

    I will take the time to test the suggested steps. This setting was always activated: Users can access data stored in OneLake with apps external to Fabric.

     

    The regional differences might be a factor. I'll make sure to let you know what I find.

  • HMSNemanja Profile Picture
    54 on 29 Apr 2024 at 07:31:22
    Re: Finance and Operations tables in Dataverse

    Hi SpaceAnalytics!

    Was your issue regarding Fabric status resolved for your tables in Fabric?

    Please have a look at your Power BI tenant settings => OneLake settings => Users can access data stored in OneLake with apps external to Fabric 

    Check if that is enabled; if not, shortcuts to tables will not be created in Fabric, even if the Lakehouse is automatically created.

  • HMSNemanja Profile Picture
    54 on 27 Mar 2024 at 15:21:20
    Re: Finance and Operations tables in Dataverse

    Hello SpaceAnalytics!

    First of all, I must admit that I have not been able to keep this post completely up to date with my communications with Microsoft.
    The main reason is that I've recently had a lot of communication with Microsoft representatives on different levels, and unfortunately it would be too big a task to keep all the different community threads up to date.
    I apologize for that!

    Regarding your questions, I can think of two main reasons behind the issues:
    1. Regional location differences between your Finance and Operations environment and your Microsoft Fabric capacity.

    2. The "Quality Update" version and build version of your Finance and Operations environment.

     

    Q:

    1. Have you ensured that all of your environments are in the same regional location?

     

    2. Which version of Finance and Operations are you currently on?

    There are new "known bugs" and minimum versions for resolving them:

    https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-select-fno-data#known-limitations-with-finance-and-operations-tables

    HMSNemanja_0-1711551875578.png

     

    Also, have you tried to trigger a manual refresh?

    HMSNemanja_1-1711552006411.png


    In our setup, we have so far included 98 F&O tables. All of them reached Fabric status "Created" and status "Active" within an hour or so from when we added them.
    However, I do have one table with Fabric status "Created" and status "Initial sync in progress"; it's been like this for almost 10 days now.

    I have an active MS case regarding this table specifically ongoing right now.
    I was initially told that this was due to our Finance and Operations environment missing the latest Quality Update for our version (Version-10.0.37, QU-10.0.1725.188).
    We got the latest version but this did not resolve our issue.
    The case has now been reassigned from the Finance and Operations support engineers to the Dataverse engineering team.

    I'm expecting feedback on that support request either tomorrow or early next week, and I promise to update this post with the information I receive. 



    That being said, I'm afraid that the issue you're experiencing could be due to a variety of reasons.

    Have you checked your Lakehouse in your Fabric workspace, after you initially created the Link?

    Are you able to see tables in the Lakehouse on the dbo schema, or are they classified as "Undefined"?

    Did you rename the Lakehouse after it was automatically created and provisioned (this could cause issues)?

    Somewhat obvious question, but have you verified the license settings in Finance and Operations for the "sql row version change tracking" property?

     

    I would recommend that you do the manual refresh trigger, and see if that helps. 
    If not, I would recommend that you create a support ticket with the Dataverse team immediately (the sooner they can start investigating the better).

    Meanwhile, I can make sure to keep the post updated if I get a reply from the Dataverse support.


    Hope this can at least help point you in the right direction.

  • SpaceAnalytics Profile Picture
    16 on 27 Mar 2024 at 10:08:10
    Re: Finance and Operations tables in Dataverse

    Hello HMSNemanja,

     

    We experienced some delays with this project, but first and foremost, thank you for your previous explanations and for answering our questions. Based on your experiences, it seems that the Fabric link is the preferred solution compared to the alternatives, yet we have not managed to get it working.

     

    In our initial attempt, we successfully created a Fabric link and were able to view all the default tables in Fabric. However, our interest lies solely in the F&O base tables. For the POC, we activated two of them (dirpartytable and custtable). We ensured all prerequisites were met and added them using the 'manage tables' interface. However, they never synchronized with the Fabric lakehouse, and I noticed their status remained 'deleted'. This was confusing, as we never actively deleted them from either the Fabric link 'manage tables' interface or the Fabric lakehouse itself.

     

    Today, we recreated the Fabric link and reactivated the two tables. It is currently undergoing another initial sync, but the deleted status appears by default again. What has been your experience with this? Is this related to the delayed sync issue you discussed with MS?

     

    I am considering reverting to the more reliable Synapse link, as I feel Fabric link lacks the robustness our production environment requires. However, I still hope to get the Fabric link functioning.

     

    Thank you in advance, you've been of great help so far!

     

    screen.png

     

    UPDATE: 3 hours later you can see that the sync is active, but the tables are still 'deleted'. They are also nowhere to be found in the newly created Lakehouse in Fabric.

     

    screen2.png

     

    Update 2: After hitting 'Refresh tables in Fabric', the status is 'failed', without further explanation.

     

    screen3.png

  • HMSNemanja Profile Picture
    54 on 01 Mar 2024 at 10:07:58
    Re: Finance and Operations tables in Dataverse

    Hi SpaceAnalytics,


    Glad to hear that you were able to establish the connection successfully!


    I would say that it depends on:
    1. Will you be using the Dataverse tables that are also included by default?

    2. How many Dataverse tables do you have?

    3. How many PowerBI users do you have?

    4. How many layers will your solution have (bronze, silver, gold) ?

     

    The reason I'm asking is that if you have a lot of Dataverse tables which you're not planning to use, you'll have to design your entire development around explicitly declaring which Lakehouse objects you want to work with, and ignore the rest.
    Also, I noticed that a Lakehouse containing 800+ tables causes heavy latency when working in the Lakehouse & SQL endpoint (only tried on an F2 capacity).

    If you're in this scenario, my suggestion is to create a new Lakehouse and use notebooks or pipelines to replicate the tables you want to use to the new Lakehouse instead.
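    As a rough illustration of that replication pattern, here's a minimal Fabric-notebook-style sketch. The table prefixes and Lakehouse names are made-up examples, and it assumes both Lakehouses are attached to the notebook so tables resolve as `<lakehouse>.<table>`:

    ```python
    # Sketch: copy a chosen subset of tables from the auto-created Lakehouse into
    # a separate, curated Lakehouse. Names below are hypothetical examples.

    def pick_tables(all_tables: list[str], wanted_prefixes: list[str]) -> list[str]:
        """Keep only tables whose names start with one of the wanted prefixes."""
        return [t for t in all_tables
                if any(t.startswith(p) for p in wanted_prefixes)]

    def replicate(spark, source_lakehouse: str, target_lakehouse: str,
                  tables: list[str]) -> None:
        """Copy each table as a Delta table into the target Lakehouse.

        Assumes both Lakehouses are attached to the notebook session, so tables
        can be addressed as <lakehouse>.<table>.
        """
        for name in tables:
            df = spark.read.table(f"{source_lakehouse}.{name}")
            (df.write.format("delta")
               .mode("overwrite")
               .saveAsTable(f"{target_lakehouse}.{name}"))

    # Example selection (hypothetical table names):
    tables = pick_tables(
        ["custtable", "dirpartytable", "systemuser", "account"],
        wanted_prefixes=["cust", "dirparty"],
    )
    print(tables)  # ['custtable', 'dirpartytable']
    ```

    In a real notebook you would then call `replicate(spark, "AutoLakehouse", "CuratedLakehouse", tables)`; a scheduled pipeline can run the same notebook to keep the curated copy fresh.
    
    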

    I also strongly recommend that you have separate workspaces for your solution layers, as the RBAC is currently on the Workspace level.

    I have also described in this thread how to remove Dataverse tables from the "Microsoft OneLake" Synapse link which gets created when you use the "Link to Fabric" functionality.
    However, since I have not yet received confirmation from Microsoft on whether this is intentional functionality or not, I do not recommend moving forward with it right now.


    All of that being said, the CSV approach to a separate ADLS Gen2 does certainly require a bigger effort regarding development and maintenance. 

    During a Fabric AMA session with the Azure Fasttrack engineers, I raised this question with them.

    The response was that they do not officially advise against this approach, but their recommendation in this specific case was to avoid using Spark notebooks for converting from CSV to Delta, as this is compute intensive (use pipelines instead).
    This approach also entails more compute usage, as you're no longer getting your data into Delta Parquet format "for free" as you do with the Link to Fabric approach.
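    For context, this is roughly the conversion step you'd be paying for yourself with the CSV approach. A minimal sketch, assuming placeholder folder paths (per the Fasttrack advice above, you'd rather run this logic as a pipeline activity than as a Spark notebook):

    ```python
    # Sketch of the CSV-to-Delta conversion that "Link to Fabric" otherwise does
    # for you behind the scenes. All paths here are placeholders.

    def delta_path_for(csv_folder: str, delta_root: str = "Tables") -> str:
        """Map an incremental-CSV export folder to its Delta output location."""
        table_name = csv_folder.rstrip("/").split("/")[-1].lower()
        return f"{delta_root}/{table_name}"

    def convert_csv_to_delta(spark, csv_folder: str, delta_root: str = "Tables"):
        """Read a Synapse Link CSV export folder and rewrite it as a Delta table."""
        df = (spark.read
                   .option("header", "true")
                   .option("inferSchema", "true")
                   .csv(csv_folder))
        df.write.format("delta").mode("overwrite").save(
            delta_path_for(csv_folder, delta_root))

    print(delta_path_for("datalake/CustTable/"))  # Tables/custtable
    ```

    Every run of this reads the full CSV export and rewrites it as Delta, which is exactly the recurring compute cost the Link to Fabric approach avoids.
    
    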


    In short: yes, I would agree that option 1 is the overall best solution, and it's clearly the desired architecture from Microsoft's point of view. However, I would recommend that you take your data volume, ETL processes and number of end-users into consideration: if you have a lot of data with heavy ETL processes and many end-users, you might want to consider creating two separate Fabric capacities.
    For example, instead of using F64 for your entire solution, you might consider F32 for ETL and F16 for Power BI datasets.

     

     

    On the option 1 note, I've received another reply from the Microsoft Dynamics / Finance and Operations support engineers:
    "

    We have the following updates from the engineering team.

     

    We do have work items planned to reduce the sync latency. However we are unable to provide an estimated time of completion. 

    Kindly note that , currently , there isn't much we can do to reduce the overall latency.

    Appreciate your understanding and cooperation in this matter.
    "

     

    My follow-up question was:
    "

    Thank you for the update and information, very much appreciated!

     

    Can I interpret the response from the engineering team, that the current latency solely depends on the size of the source table?

    Or is there any risk that the latency could increase depending on the number of tables in our “Microsoft OneLake” Synapse Link?

     

    On this subject, according to your documentation, all Dataverse tables which have “Track changes” enabled are included by default when using “Link to Fabric”.

    Is it possible to manually remove individual tables from the Dataverse source, if we go to Manage Tables in the “Microsoft OneLake” Synapse Link?
    "

     

     

    Let me know if you have more questions.

    Best of luck!
