I'm developing a solution that integrates both Power Apps and Power BI, sharing a common data source. The data originates from two primary sources:
Multiple tables in Snowflake (approximately a dozen)
Azure DevOps (ADO) via an OData connection
Additionally, I plan to embed some Power BI report visualizations within the Power Apps solution. My main concern is establishing the most efficient and maintainable process for data ingestion and modeling. Specifically:
What's the recommended approach for importing the data and constructing the data model?
Should I utilize a Power BI dataflow, a Dataverse dataflow, or an alternative method?
How can I avoid duplicating efforts in data management between Power Apps and Power BI?
What's the best way to ensure that any changes to the data model can be implemented in a single location, affecting both Power Apps and Power BI simultaneously?
I'm seeking a solution that minimizes redundancy and simplifies ongoing maintenance of the data pipeline and model across both platforms.
Also, we will not be using any Fabric or Premium capacity features.
Use Dataverse dataflows to ingest the data from Snowflake and from Azure DevOps (via the Analytics OData feed).
Model the data centrally in Dataverse.
Consume the same Dataverse tables from Power Apps natively and from Power BI through the Dataverse connector.
This keeps the model in one place: because both platforms read the same Dataverse tables, a change to the model only has to be made once and flows through to both.
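For illustration, the two source queries inside the dataflow might look roughly like the Power Query (M) sketch below. The account, warehouse, database, organization, and project names are all placeholders, not values from your environment:

```powerquery-m
// Sketch of the two source queries in one Dataverse dataflow.
// "myaccount.snowflakecomputing.com", "MY_WH", "SALES_DB",
// "MyOrg", and "MyProject" are placeholders.

// Snowflake source (one query per table, or navigate from here):
let
    Source  = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"),
    SalesDb = Source{[Name = "SALES_DB"]}[Data]
in
    SalesDb

// Azure DevOps source via the Analytics OData feed:
let
    WorkItems = OData.Feed(
        "https://analytics.dev.azure.com/MyOrg/MyProject/_odata/v4.0-preview/WorkItems",
        null,
        [Implementation = "2.0"]
    )
in
    WorkItems
```

In the dataflow, each query is mapped to a Dataverse table (new or existing); Power BI then picks those tables up with the Dataverse connector, so no transformation logic needs to be duplicated on the Power BI side.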
If you need more details, let me know.