We have an old 40 GB Dataverse environment that is no longer used (it stores data from 2021 and earlier), but on which we still want to run Power BI reporting. Because the reporting will be infrequent, not time-sensitive, and will involve only a limited number of queries, our main consideration is keeping data storage low-cost.
So far, we have identified the following options to archive our database:
(1) Export to a SQL Server database.
(2) Create an Azure Synapse Link to sync the data to Azure Data Lake Storage Gen2. This appears to be the Microsoft-recommended method, but the cost impact is unclear: Synapse Analytics itself is expensive, and I'm not sure whether the data can be accessed without it.
(3) There may be other options, but our team is not yet very familiar with Azure and still learning, so I'm all ears for other recommendations.
Both methods would work, but for 40 GB of Dataverse table data, what is the most cost-effective way to store it?
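For context on my uncertainty in option (2): my understanding is that Synapse Link lands each Dataverse table as plain CSV files (plus metadata) in the ADLS Gen2 container, so in principle any tool that can read files from blob storage could query them without a Synapse workspace. A minimal sketch of that reading pattern, using only the Python standard library and an in-memory stand-in for one exported file (the column names are hypothetical Dataverse-style attributes, not from our actual environment):

```python
import csv
import io

# Stand-in for one CSV file that Synapse Link would write to the
# ADLS Gen2 container. In practice the file would be downloaded with
# any blob client or mounted storage; no Synapse workspace is needed
# just to read it. Column names here are hypothetical examples.
exported_csv = io.StringIO(
    "accountid,name,revenue\n"
    "a1,Contoso,1200.50\n"
    "a2,Fabrikam,800.25\n"
)

# Parse rows as dictionaries keyed by column name.
rows = list(csv.DictReader(exported_csv))

# A simple aggregation of the kind a Power BI report might do.
total_revenue = sum(float(r["revenue"]) for r in rows)
print(len(rows), total_revenue)  # 2 2000.75
```

If that assumption holds, the storage cost question reduces to the price of keeping 40 GB of flat files in ADLS Gen2 (e.g. in a cool/archive tier), independent of any compute service.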