Hi,
In database design we spend ages normalising data to reduce duplication and improve data quality.
However, it's easy to get carried away sometimes and end up over-normalising data to the point where it takes ages to put it back together before you can make any sense of it. (I was certainly guilty of this in the past when starting out with SQL.)
In Azure SQL we have a handy tool called a 'Stored Procedure' that can handle all the inner joins, lookups, aggregations and so on to produce a simple flat table that Power BI can call and then just finish off the math.
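For illustration only (the procedure, table and column names below are all made up, not my real ones), I mean something along these lines, where the procedure does the heavy lifting and Power BI just calls it:

CREATE PROCEDURE dbo.usp_FlattenedSales
AS
BEGIN
    SET NOCOUNT ON;

    -- Reassemble the normalised tables into one flat result
    SELECT
        c.CustomerName,
        r.RegionName,
        p.ProductName,
        SUM(ol.Quantity * ol.UnitPrice) AS LineTotal
    FROM dbo.OrderLine AS ol
    JOIN dbo.[Order]   AS o ON o.OrderId    = ol.OrderId
    JOIN dbo.Customer  AS c ON c.CustomerId = o.CustomerId
    JOIN dbo.Region    AS r ON r.RegionId   = c.RegionId
    JOIN dbo.Product   AS p ON p.ProductId  = ol.ProductId
    GROUP BY c.CustomerName, r.RegionName, p.ProductName;
END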
Is there an equivalent method to Stored Procedures in Dataverse?
Here's the problem I'm faced with.
Step one.
Multiple 'normalised' Dataverse tables that need to be joined up into a flat table.
This is a simple diagram of what I'm trying to achieve (table and column names are for illustrative purposes) - essentially the reverse of what we normally do.

Step two.
If that can work with the standard Dataverse tables, the next stage is to repeat the process, but this time using Virtual Tables to pull in data from the various endpoints, including SharePoint lists, Oracle (cloud and on-premises), Salesforce, and even the odd Excel spreadsheet (if possible).
In one combination (within the same database) I currently have to join 6 linked tables just to match and extract the two values needed - nuts. In another combination I have to pull 4 tables from 3 separate endpoints to get the two values required.
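Purely to illustrate that first case (table and key names made up again), in SQL terms it boils down to a chain like this just to return two columns:

SELECT
    t1.ValueA,
    t6.ValueB
FROM TableOne   AS t1
JOIN TableTwo   AS t2 ON t2.Key1 = t1.Key1
JOIN TableThree AS t3 ON t3.Key2 = t2.Key2
JOIN TableFour  AS t4 ON t4.Key3 = t3.Key3
JOIN TableFive  AS t5 ON t5.Key4 = t4.Key4
JOIN TableSix   AS t6 ON t6.Key5 = t5.Key5;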
Is even part of this possible in Dataverse?
If not, the alternative suggestion we have is to use multiple Flows to pull and aggregate the data and then write it to a combined table. That's not a great solution, as the Flows would need to be running almost permanently because a good proportion of the data is constantly changing.
I expect others have faced a similar issue, but I couldn't find any specific posts on how to resolve it... maybe it's not possible.