Hi All,
I have built a workflow that gets data from an API and writes it to an Excel Online spreadsheet / on-premises SQL Server database. (In fact, I've built two separate flows: one writes to an Excel Online spreadsheet and the other writes to a SQL Server database.)
For this example, I am including screenshots from the first scenario.
The flow works perfectly fine, and it takes about five seconds to extract the data from the API endpoint and parse it. However, writing to the spreadsheet/database took about three hours to finish. (I have 6,611 rows and 5 columns of data.) Each row takes roughly a second and a half to write, so the entire process took about 3 hours to complete.
Is this normal? Is my design flawed? Is it because I'm using the "Apply to each" control instead of adding the fields from the Parse JSON operation directly to the Excel spreadsheet?
I tried doing that, but the required fields from the parsed JSON do not show up in the "Add a row" action until I add an additional step: an "Apply to each" with "Select an output from previous steps".
Any advice on this would be of great help!
P.S. The same happens for the flow that writes to SQL Server; its write action also takes about 3 hours.
Best Regards,
VJ
For anyone facing similar issues…
You can now use the Excel Batch action templates:
https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Delete/td-p/1634375
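(Purely for intuition, not the template itself: batching is so much faster because one call can carry many rows instead of one action run per row. As a rough illustration outside Power Automate, here is a minimal Python sketch using the Microsoft Graph workbook API; the token, drive item ID, table name, and column mapping are all placeholder assumptions.)

```python
# Minimal sketch: write many rows per request to an Excel Online table via the
# Microsoft Graph workbook API, instead of one "Add a row" action per row.
# ACCESS_TOKEN, DRIVE_ITEM_ID, "Table1" and the field names are placeholders.
import requests

ACCESS_TOKEN = "<token with Files.ReadWrite delegated permission>"
DRIVE_ITEM_ID = "<drive item id of the workbook>"
TABLE_NAME = "Table1"

URL = (
    "https://graph.microsoft.com/v1.0/me/drive/items/"
    f"{DRIVE_ITEM_ID}/workbook/tables/{TABLE_NAME}/rows/add"
)

# The ~6,611 records parsed from the API response, as dicts with 5 fields each.
parsed_records = []  # fill from the Parse JSON output

rows = [
    [r["field1"], r["field2"], r["field3"], r["field4"], r["field5"]]
    for r in parsed_records
]

# Send in chunks so each request body stays a manageable size; each request
# writes hundreds of rows in one round trip instead of one.
CHUNK = 500
for start in range(0, len(rows), CHUNK):
    resp = requests.post(
        URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"index": None, "values": rows[start:start + CHUNK]},
    )
    resp.raise_for_status()
```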
There is a SQL batch action template as well.
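(Same caveat: this is not the SQL template, just a sketch of the principle. With a direct connection to the database, the equivalent of batching is sending many parameter sets per round trip, for example with pyodbc; the connection string, table, and column names below are placeholders.)

```python
# Minimal sketch: bulk-insert the parsed rows into SQL Server in batches rather
# than one INSERT per Power Automate action run. All names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-onprem-server;DATABASE=MyDb;UID=my_user;PWD=my_password"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # pack parameter sets into bulk batches

# The ~6,611 records as 5-tuples, in the same column order as the INSERT below.
rows = []  # fill with (col1, col2, col3, col4, col5) tuples from the parsed API response

cursor.executemany(
    "INSERT INTO dbo.ApiData (Col1, Col2, Col3, Col4, Col5) "
    "VALUES (?, ?, ?, ?, ?)",
    rows,
)
conn.commit()
conn.close()
```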
Hi @VJ_Nanduri,
Your configuration is correct and the design is reasonable. It just takes a long time because the amount of data is large. You can try adjusting the settings of the Apply to each action (for example, turning on its concurrency control) and see whether the execution time can be shortened.
Best Regards,