
Exporting large amount of data from Dataverse to Azure SQL

Posted by therufus

Hello all.

 

I have a requirement to export a large amount of data (10-20k rows at a time) from a Dataverse table into Azure SQL. I can't use a flow, since flows max out at 5,000 rows. We calculate interest daily on thousands of matters, a process which completes at 3am local time. All of those interest accrual rows need to be passed into an SQL database hosted in Azure.

 

Doing a search through the forum, I've found a link to https://learn.microsoft.com/en-us/power-platform/admin/replicate-data-microsoft-azure-sql-database but the article states that the option is no longer supported.

 

Can anyone give me some insight on how to accomplish this? Happy to answer questions that may lead to a solution.

  • Shashank Bhide (Moderator):

    I'd suggest using Azure Data Factory or SSIS pipelines.

  • therufus:

    Can you expand on that at all? I have this in Azure:

    [screenshot: therufus_0-1709874817270.png]

    Not sure where to go from here.

  • Shashank Bhide (Moderator):

    No, no, no. I was totally confused for a second: the image you're showing looks like an Azure Active Directory (a.k.a. Entra ID) application with the name "azure data factory". Azure Data Factory is a separate component, and you need data migration skills to write a pipeline. You can check my article below; unfortunately, I couldn't complete the second part.

     

    https://www.linkedin.com/pulse/using-adf-large-size-data-ingestion-dataverse-part-1-shashank-bhide/?trackingId=gZcgoTIgQJqekG5W444G4w%3D%3D
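
    For later readers: once a pipeline with a Dataverse source and an Azure SQL sink exists, you can trigger it on a schedule or from code. A rough sketch using the azure-mgmt-datafactory SDK; every name below is a placeholder I've made up, not something from this thread:

    ```python
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # All names are placeholders for illustration.
    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "rg-dataverse-export"
    FACTORY = "adf-accruals"
    PIPELINE = "CopyAccrualsToSql"  # assumed: Dataverse source -> Azure SQL sink

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Kick off a run of the existing pipeline...
    run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY, PIPELINE, parameters={})

    # ...and check on its status afterwards.
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY, run.run_id).status
    print(status)  # e.g. "InProgress", then "Succeeded"
    ```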

  • Federico Pelliccia:

    What kind of flow are you talking about? If you mean Power Automate and you are NOT using FetchXML, you can go into the action's settings and turn on pagination; it works up to 100k records.
    You could also build your own pagination if you need FetchXML for a complex query, as sketched below.
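
    To make the custom-pagination idea concrete: with FetchXML you can bump the fetch element's page attribute on each request until a short page comes back. A minimal sketch against the Dataverse Web API, assuming a made-up new_interestaccrual table and a valid bearer token (for very large tables, the paging cookie performs better):

    ```python
    import requests
    from urllib.parse import quote

    # Hypothetical environment URL, token, and table name.
    BASE = "https://yourorg.api.crm.dynamics.com/api/data/v9.2"
    HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

    # FetchXML paging: request 5,000 rows at a time, advancing the page number.
    FETCH = (
        '<fetch count="5000" page="{page}">'
        '<entity name="new_interestaccrual">'
        '<attribute name="new_amount" />'
        '</entity>'
        '</fetch>'
    )

    rows, page = [], 1
    while True:
        resp = requests.get(
            f"{BASE}/new_interestaccruals?fetchXml={quote(FETCH.format(page=page))}",
            headers=HEADERS,
        )
        resp.raise_for_status()
        batch = resp.json()["value"]
        rows.extend(batch)
        if len(batch) < 5000:  # a short page means we've reached the end
            break
        page += 1
    ```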

  • therufus:

    Hey guys.

     

    Thank you for your assistance so far.

     

    Essentially, we have a large number of files that accrue interest on a daily basis. The interest is calculated within Dynamics 365 based on the information on file, and that interest information is then added to a Dataverse table.

     

    That information needs to be fed back into an Azure SQL database so that when payments are made against those files, each interest item gets paid off and updated to reflect the change. A dataflow can handle the update back into Dataverse, but getting the information out of Dataverse is the hard bit (since Microsoft have neglected to make anything that goes the other way).

     

    I really need two solutions: one to copy all of the existing information from Dataverse into SQL (over 1m rows) as a one-off ingestion process, and a daily process that just inserts the delta.
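
    One way to structure those two jobs, for what it's worth: use the same Dataverse Web API query for both, with a createdon filter switched on for the daily delta. A rough sketch; the entity set and column names are made up:

    ```python
    import datetime as dt

    # Hypothetical entity set and columns.
    BASE = "https://yourorg.api.crm.dynamics.com/api/data/v9.2"

    def accruals_url(delta_only: bool) -> str:
        """One query serves both jobs: the one-off load just drops the filter."""
        url = f"{BASE}/new_interestaccruals?$select=new_accrualid,new_amount,createdon"
        if delta_only:
            # Daily job: only rows created since the previous day's 3am run.
            since = dt.datetime.now(dt.timezone.utc) - dt.timedelta(days=1)
            url += f"&$filter=createdon ge {since.strftime('%Y-%m-%dT%H:%M:%SZ')}"
        return url

    print(accruals_url(delta_only=True))
    ```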

  • Fubar (Super User):

    As already mentioned in an earlier post, Power Automate flows default to 5,000 rows, but there is a setting (expand the top right of the action) that lets you increase that limit. If your data is larger than the setting allows, Flow also supports the 'get next' / paging cookie approach; you just need to structure your flow to loop until all pages are retrieved. https://powerusers.microsoft.com/t5/Power-Automate-Community-Blog/Page-large-result-sets-Dataverse-in-Power-Automate/ba-p/2199577
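
    In code form, that 'get next' loop looks roughly like this against the Dataverse Web API: request a page size via the Prefer header and follow @odata.nextLink until it disappears. The table name and token are placeholders:

    ```python
    import requests

    # Hypothetical table and token; the paging loop is the point here.
    url = ("https://yourorg.api.crm.dynamics.com/api/data/v9.2"
           "/new_interestaccruals?$select=new_amount")
    headers = {
        "Authorization": "Bearer <token>",
        "Accept": "application/json",
        "Prefer": "odata.maxpagesize=5000",  # ask for 5,000 rows per page
    }

    rows = []
    while url:
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload["value"])
        url = payload.get("@odata.nextLink")  # absent on the last page

    print(f"fetched {len(rows)} rows")
    ```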

     

    Also, I'm not sure exactly what you mean by "A dataflow can handle the update back into Dataverse, but getting the information out of Dataverse is the hard bit (since Microsoft have neglected to make anything that goes the other way)" - you should be able to achieve what you want with Power Automate, Dataflows, SSIS, or Logic Apps (based on what you have written, all of them can do what you need).

  • therufus:

    I tried using the pagination suggestion above. After half an hour, I only got 5000 records:

    [screenshot: therufus_0-1710309223547.png]

    [screenshot: therufus_1-1710309247610.png]

    It's like it just ignores my request. Am I doing something wrong?

  • Federico Pelliccia:

    Are you using FetchXML inside the GetAccruals action?

  • therufus:

    No FetchXML.

     

    I just don't understand why I can't "Dataflow" information back into SQL.

     

    1. An automated process calculates and inserts records into an interest accrual table in Dataverse each day.

    2. I need something that will sync that with a table in Azure SQL (hence this post).

    3. Further data operations in SQL after the fact.

     

    I mean, I could just trigger an update/insert whenever a record gets created in the interest accrual table, but with tens of thousands of records a day, the Azure SQL server would get absolutely hammered by all the individual calls.

     

    Is this 2024 or 1982?
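
    Batching is the usual way around the hammering concern above: accumulate the day's rows and send them in one bulk round trip rather than one call per record. A rough sketch with pyodbc's fast_executemany; the connection string and table are placeholders:

    ```python
    import pyodbc

    # Placeholder connection string and table; the point is one batched
    # round trip instead of tens of thousands of single-row calls.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=yourserver.database.windows.net;DATABASE=yourdb;"
        "UID=<user>;PWD=<password>"
    )
    cursor = conn.cursor()
    cursor.fast_executemany = True  # send parameter batches in bulk

    rows = [("ACC-001", 12.34), ("ACC-002", 56.78)]  # stand-in for the day's accruals
    cursor.executemany(
        "INSERT INTO dbo.InterestAccrual (AccrualId, Amount) VALUES (?, ?)",
        rows,
    )
    conn.commit()
    ```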
