Hello,
I am using an Apply to each to loop through the records, with some conditions inside the loop. My observation is that the flow gets stuck in the Apply to each and takes 30 minutes to loop through 150 records.
Any help improving the performance of the flow is appreciated.
Power Automate is not ideal. But you are a human, and if something forces you to use an "Apply to each" loop, you can find a way to resist, right?
See https://tomriha.com/stop-power-automate-flow-from-adding-apply-to-each-automatically/
I have a flow that's been running for 26 minutes to put ONE attachment name and date into a spreadsheet. I'd love to get rid of the Apply to each action, but Power Automate forces you to use it, even though this flow deals with only one attachment per email.
Wow, that's awesome. My company is pretty limiting as far as what I can implement, but I'll see if I can get this working in my flow. Good work. That's pretty much exactly what I mean, though: I don't understand why the default API calls are so inefficient. It makes a call per row, and there's no out-of-the-box solution in Flow for batch updating, which is crazy to me; updating 500 rows in Excel should never take 20-30 minutes. Thank you for the response, I'll see if I can use this update.
Update:
I got the script working pretty easily. Flow time went from ~30 minutes to 1 minute; specifically, the add-row action, which took around 27 minutes, was cut down to 29-37 seconds. The remaining 1-minute run time is due to other limiters I have in place in the flow on purpose. That's pretty crazy. This should be Microsoft's default update method. The code is fairly hard-coded as is, but it shouldn't be too much work to make it dynamic.
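For anyone curious what the batching idea looks like in code: the win comes from shaping all the records into one 2D array and writing it in a single call (e.g. an Office Script's `table.addRows`) instead of one "Add a row" action per record. This is a minimal sketch in plain TypeScript, with made-up field names, showing just the shaping step:

```typescript
// Shape an array of JSON records into the 2D string array that a single
// batch write to an Excel table expects. `columns` fixes the column order;
// fields missing from a record become empty cells.
function toRowValues(
  records: Record<string, unknown>[],
  columns: string[]
): string[][] {
  return records.map((rec) =>
    columns.map((col) =>
      rec[col] === undefined || rec[col] === null ? "" : String(rec[col])
    )
  );
}

// Example: two items becoming two Excel rows, written in one call.
const rows = toRowValues(
  [
    { Title: "Invoice 1", Amount: 120 },
    { Title: "Invoice 2", Amount: 80, Note: "rush" },
  ],
  ["Title", "Amount", "Note"]
);
// rows is [["Invoice 1", "120", ""], ["Invoice 2", "80", "rush"]]
```

In an actual Office Script you would pass `rows` to something like `workbook.getTable("Table1").addRows(-1, rows)`; the point is that the loop happens in memory, not as flow actions.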
Yes, it can be pretty slow using standard methods to create & update things in Excel. It’s a little surprising an open-source/community dev can come up with a set-up 50-100x faster than the MS defaults:
Agreed, there are so many limitations I've seen in the Power Platform, and inefficient algorithms for handling data. Hell, even simple API calls can take a while to transfer data. If Google's Firebase tools can be real-time, so can the MS Power Platform. The company is massive and used in so many businesses; I expect more from them at this point. It's clear that whatever they're using for Apply to each isn't optimized at all. I used one Apply to each to add new rows to an Excel file, and it feels like it takes as long as three nested for loops would in JS. It's simple relational data, 1 record in SP = 1 row in Excel, so I don't understand why it needs to take so long, especially because it takes time getting items before it even begins creating the rows. If you already have the data and aren't reading as you write, it should be even faster, but it isn't.
Having the same issue in 2022; it's not efficient at all. I never had this issue when building my own conversion components in JS.
This was quite the lively thread.
For slow Apply to each loops, the answer is often to structure things better and not use them, or at least to use them less.
Here is a batch update SharePoint template along with links to batch create & batch delete SharePoint:
I'm also working on batch actions for Excel here:
Batch Update: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Excel/td-p/1624706
Batch create is available with premium HTTP connectors: https://sharepains.com/2020/06/08/super-fast-update-excel-using-power-automate/amp/
Although you could probably also create something similar without premium connectors with Office Scripts, like I did with the batch update: https://youtu.be/4g8Lh0gzEnc
And I’ll probably make a batch delete for Excel with Office Scripts too if I can’t find something soon.
Then use Filter array actions instead of conditionals in loops whenever you can. You can set one filter for the true condition and another for the false condition if you need both.
Use Select actions to select columns, reformat data, etc.
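Translated out of the flow designer, those two tips amount to partitioning and projecting the data once, outside any loop. A rough TypeScript sketch of the equivalent logic, with illustrative field names:

```typescript
interface Item {
  Status: string;
  Title: string;
  Amount: number;
}

const items: Item[] = [
  { Status: "Open", Title: "A", Amount: 10 },
  { Status: "Closed", Title: "B", Amount: 20 },
  { Status: "Open", Title: "C", Amount: 30 },
];

// "Filter array" twice instead of a condition inside a loop:
// one filter for the true branch, one for the false branch.
const open = items.filter((i) => i.Status === "Open");
const closed = items.filter((i) => i.Status !== "Open");

// "Select" to keep only the columns you need and reshape the data.
const summary = open.map((i) => ({ name: i.Title, value: i.Amount }));
```

Both operations run over the whole array at once, which is why they are so much faster than a condition evaluated on every iteration of an Apply to each.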
Also @Alan_Sanchez, if you still need something for CSV situations, you can take a look at this Get CSV Data to JSON template (no apply to each loops): https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/CSV-to-Dataset/td-p/1508191
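The core idea behind a loop-free CSV conversion can be sketched in a few lines, assuming a simple comma-separated file with a header row and no quoted fields (the template linked above handles the messier cases):

```typescript
// Parse simple CSV text (header row, no quoted or escaped commas)
// into an array of objects in one pass, with no per-row loop actions.
function csvToJson(csv: string): Record<string, string>[] {
  const [headerLine, ...lines] = csv.trim().split(/\r?\n/);
  const headers = headerLine.split(",");
  return lines.map((line) => {
    const cells = line.split(",");
    return Object.fromEntries(
      headers.map((h, i) => [h, cells[i] ?? ""] as [string, string])
    );
  });
}

const data = csvToJson("Title,Amount\nInvoice 1,120\nInvoice 2,80");
// data[0] is { Title: "Invoice 1", Amount: "120" }
```

In a flow, the same split-and-map shape is what the Select action gives you over the array of CSV lines.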
I have just registered to AGREE WITH YOU ON A VERY ADVANCED LEVEL. Every. Fkin. Day. I run into the strangest and dumbest errors Microsoft has made. They are not consistent about their systems, making inappropriate changes day by day. "Yesterday" we had "Get rows", which could easily access Excel files from a flow; working jack audio ports (without bull**bleep** 3rd-party software needed); and upgradeable systems (I have a PC which can't upgrade to Win11, doesn't show any errors, but rolls back without a word said). You can't have tasks (Planner) in private Teams channels, and you also can't have subchannels. These are just minor but very annoying things, and I could go on for days. But the worst is that it's uncontrollable. Our company pays more than 2,500 USD / month, and I have to deal with Microsoft bull**bleep** every day.
Hello @Anonymous, I'm not sure you're getting what I'm saying here. I agree that, as much as possible, you should not be doing thousands of iterations in an Apply to each, and where possible you should use the array controls until you need to amend something in your data source; even then, you should look at the batch operations in the Graph.
You seem to be wanting to start an argument here; I'm trying to make sure that the OP has as much useful info as they can get. I have been authoring flows for 5 years, pretty much since the product's release, and have personally authored nearly 200 flows for various clients and for the org I work for. I am answering a specific question about a specific problem; the generic answer of "check what actions you are using" and "Microsoft are useless" doesn't help the OP. I have given them something to try based on thousands of hours of experience, not on technicalities.
Also, in one of your previous answers you said that REST calls can only do one request at a time. This is not right, as you can absolutely send batch requests to both SharePoint and the Graph:
https://docs.microsoft.com/en-us/sharepoint/dev/sp-add-ins/make-batch-requests-with-the-rest-apis
https://docs.microsoft.com/en-us/graph/json-batching
You can use the Select command to generate your batch requests in blocks of 20.
Use a Do until loop, counting in steps of 20.
Then use a Select action to build the JSON batch requests.
Then finally wrap the body in a "requests" array and make the call:
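As a rough sketch of what those steps produce: chunk the items into groups of 20, map each item to a sub-request, and wrap each group in a "requests" array to send in one call to the Graph `$batch` endpoint. The URL pattern and field names below are illustrative, not taken from the flow above:

```typescript
interface BatchRequest {
  id: string;
  method: string;
  url: string;
  headers?: Record<string, string>;
  body?: unknown;
}

// Build Graph-style $batch payloads, max 20 sub-requests per payload.
function buildBatches(
  items: { id: number; fields: unknown }[]
): { requests: BatchRequest[] }[] {
  const batches: { requests: BatchRequest[] }[] = [];
  for (let i = 0; i < items.length; i += 20) {
    const chunk = items.slice(i, i + 20);
    batches.push({
      requests: chunk.map((item, j) => ({
        id: String(j + 1),
        method: "PATCH",
        // Illustrative URL; a real flow targets a specific site and list.
        url: `/sites/{site-id}/lists/{list-id}/items/${item.id}/fields`,
        headers: { "Content-Type": "application/json" },
        body: item.fields,
      })),
    });
  }
  return batches;
}

// 45 items become 3 payloads (20 + 20 + 5), each one HTTP call.
const batches = buildBatches(
  Array.from({ length: 45 }, (_, i) => ({ id: i + 1, fields: { Status: "Done" } }))
);
```

The Select action in the flow plays the role of the `chunk.map` here: one projection builds all 20 sub-requests at once, so the only loop is over batches, not over rows.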
This flow took my import run from approximately 17 hours per 1,000 rows to about 4 minutes. You should also note that I avoided the use of variables within the loop itself; I only used one to store the current iteration ID.
If the flow has thousands of iterations in an Apply to each container, then it might not be the types of actions that are slowing things down, but the number of them. I ran into this about a month ago, because one of my flows had a couple thousand iterations per run, and it was scheduled to run every hour. Normally, it would complete in about 5 minutes, but after the second or third run, it started taking several hours, because the flow was being throttled by Microsoft. You might want to review the plan action limits here...