I am running an automation to pull data from Dynamics into an Excel file in SharePoint. It works, but after repeatedly noticing that I got exactly 5000 rows, I think I'm up against a limit on the number of rows returned in a single operation. So I tried following this blog: Power Automate: Read Excel File – Piyush K Singh
I built my whole automation and it saves without errors. But when I test it, I routinely get "Error - Action List_Rows failed". Error details: Malformed XML.
Strangely, it blames my "List_Rows" step, but that step is almost identical to the original (non-batch) version of this flow, which works fine. All I added in this version of the step is 5000 in the Row Count and the skipCount variable (as described in the blog), plus a change to the pagination settings. However, when I started having problems I tried removing each of those elements from the List Rows action and I kept getting the same Malformed XML errors, so I think I am hitting a bigger problem...
Here is my "ListRows" code:
{
  "inputs": {
    "host": {
      "connectionName": "shared_commondataserviceforapps",
      "operationId": "ListRecords",
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_commondataserviceforapps"
    },
    "parameters": {
      "entityName": "sr_enrollment_tracker_activities",
      "$select": "_sr_account_value, sr_employeename, description, subject, statuscode, statecode, sr_clientactivityresult, _ownerid_value, scheduledstart, sr_callnotes",
      "$filter": "scheduledend gt @{addDays(utcNow('yyyy-MM-ddTHH:mm:ssZ'), -30)}",
      "$top": 5000,
      "$skiptoken": "@variables('skipCount')"
    },
    "authentication": "@parameters('$authentication')"
  }
}
I am going to mark this complete. I gave up on the batch operation and found a way to deliver the data from Dynamics over to Excel as an array. Then I can distribute the array into a table using an ExcelScript (which I am way better with). I had to break it into multiple runs (gather data, convert to array, deliver the array to Excel, run a script to distribute the data into the table... then repeat until all the data shows up). In the end I am getting about 12,000 rows of data, and the whole automation now takes about 90 seconds instead of 4-5 hours and requires no manual intervention to reset itself for tomorrow's data.
#Winning
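For anyone trying the same approach, here is a minimal sketch of the kind of ExcelScript described above. It assumes the flow's Run script action passes each batch as a string parameter called rowsJson (a JSON-serialized array of rows) and that the destination table is named DataTable - both names are hypothetical, so adjust them to your workbook:

function main(workbook: ExcelScript.Workbook, rowsJson: string) {
  // Parse the batch handed over by the flow (one inner array per row).
  const rows: (string | number | boolean)[][] = JSON.parse(rowsJson);
  if (rows.length === 0) {
    return; // nothing to add on this run
  }

  // Look up the destination table by name (hypothetical name).
  const table = workbook.getTable("DataTable");
  if (!table) {
    throw new Error("Table 'DataTable' was not found in the workbook.");
  }

  // Append the whole batch in one call instead of row by row;
  // repeated runs simply keep appending until every batch has arrived.
  table.addRows(-1, rows);
}

How the rows array gets built in the flow (for example from a Select action) is up to you; the script only needs the column values in the same order as the table's columns.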
Ok, yeah, it's definitely the Apply to each that's taking forever. Here's 35 minutes into this morning's run of OldFlow:
Thank you again for helping. Lots to discuss:
1. In BatchFlow, I changed my ListRows: I turned on Pagination and set the threshold to 20,000 (a number I assume is larger than my total number of rows; or should I just say 5000, since I'm setting the size of each batch here?). I also cleared out the Row Count and Skip Token fields. Now it seems to run! But it's still SUPER slow! I opened the destination file and my stopwatch: it fills out 10 rows in 15 seconds. It seems to be the "Apply to each" part that's taking so long.
Edit to add: After 391 rows, BatchFlow timed out. It's getting way less data than OldFlow.
2. How are you getting 12,000 rows in 6 seconds?!? If I could do that with OldFlow, I wouldn't have to mess with this batch business. My OldFlow looks like this...
It pulls a pre-fabricated report from Dynamics and grabs all entries that are less than 30 days old, plus everything into the future. It then dumps that data into a table in an Excel file stored in our SharePoint. That's it!
When I recently realized that 5000 rows was the cap, I did the following:
In ListRows, I put in a Row Count of 20,000. I also turned on Pagination and set the Threshold to 20,000 as well.
In ApplyToEach, I turned on Concurrency Control and set it to 50 (the max).
It still takes 2-4 hours or more. I am pretty sure the time is spent adding the data into the Excel file. If I open that Excel file while the automation is running, I can watch it enter data, and it goes pretty slowly. I just can't tell if that's because it's only finding out about the data for each row (from Dynamics) slowly, or whether it already "knows" the data and is just slow at adding it into that Excel file (see the sketch below). Here is my "Apply to each" (I actually have 2 very similar flows with slightly different "Apply to each" sections)...
Any ideas how to get data quicker?
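On the question of where the time goes: with Apply to each plus the Excel "Add a row into a table" action, every row is its own call to the Excel connector, so thousands of rows add up to hours even with concurrency raised. A single script run that writes the whole batch at once avoids that per-row overhead - roughly the difference sketched here (same hypothetical table and parameter names as the sketch above):

function main(workbook: ExcelScript.Workbook, rowsJson: string) {
  const rows: (string | number | boolean)[][] = JSON.parse(rowsJson);
  const table = workbook.getTable("DataTable"); // hypothetical name
  if (!table) {
    throw new Error("Table 'DataTable' was not found in the workbook.");
  }

  // Per-row writes - comparable to an Apply to each adding one row at a time:
  // for (const row of rows) {
  //   table.addRow(-1, row); // thousands of separate writes
  // }

  // Bulk write - the whole batch lands in one call:
  table.addRows(-1, rows);
}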
For your Dataverse List rows action, you should leave Row count and Skip token blank and just set the Pagination Threshold. This will return your items in batches, up to the amount you specify for the Threshold. In my example, I'm returning 12,000 items and it takes approx. 6 seconds to return all of them.
Is it your List rows action that's taking a long time when you run the flow? I'm still not sure what you're doing in your Do until and Apply to each.
When I run the flow:
Overall flow:
List Rows 2:
I am pulling a pre-prepared report from our Dynamics instance, and only grabbing the columns I need (to save resources). I am filtering to anything from the last 30 days onward, through forever into the future. This much is exactly the same as my prior automation, which works fine but is slow and only gets 5000 rows.
Both the Row Count and the Skip token are new elements supposedly needed per the instructions I am following.
Settings for List Rows 2:
This part is not in my original automation either.
Let me know if there is anything else that would be helpful to see. I am interested in getting the batch operations to work since I think it would speed things up. My prior flow takes over 2 hours to run and only gets 5000 rows.
I think the speed of the batch operations is also desirable, but I did try setting the cap to 10000 yesterday. Last night's overnight run pulled a full 10000 rows, but strangely it still only took 2 hours. (Why would the time not be correlated to the number of rows being pulled?) So right now I am running a test of the old flow with a cap of 20000.
Thanks for the help!
Are you able to show a screenshot of your flow to get a better understanding of what you currently have?
If you want to get more than 5000 items from your Excel file, you don't need to set the Top Count or Skip Count (you can leave them blank); just set the Pagination Threshold - this will run through and return items up to your Threshold.