Two questions.
I am making a flow which gets data from a SharePoint list.
It then finds the unique departments, and for each department I want to fill out an Excel file for expenses and then add a total row.
Question 1: Since I am creating the total row, I probably can't use parallel processing?
Question 2: For each department, I filter the data pulled from Get elements, parse the filtered data as JSON, and then populate the Excel rows one by one. Is it better/faster to do it this way, or to create a new Get elements with the department as an OData filter and then populate each row?
You may have more luck using concat & float to add the minus sign, like:
float(concat('-', string(InsertSumDynamicContentHere)))
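Why this works: the minus sign is attached at the string level, and float() then parses the signed text back into a number, so decimals are handled fine (unlike mul, which complains about non-integers). A TypeScript analogue of the same idea, purely for illustration:

```typescript
// Analogue of float(concat('-', string(sum))): negate by prefixing '-'
// at the string level, then parsing the signed text back into a number.
function negateViaString(sum: number): number {
  // Caveat (applies to the flow expression too): if sum is already
  // negative, "-" + "-5" yields "--5", which does not parse.
  return parseFloat("-" + String(sum));
}

console.log(negateViaString(123.45)); // -123.45
```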
There are a few options to send full arrays of data to Excel.
But if you have maxed out the concurrency setting, it shouldn't take too long to load roughly 25-55 rows per department.
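One such option, as a hedged sketch rather than the only approach: have the flow build the complete 2D array of row values for a department and hand it to an Office Script, which writes everything in one call (e.g. ExcelScript's table.addRows) instead of one "Add a row into a table" action per line. The payload-building side looks roughly like this in TypeScript, with invented field names standing in for the real SharePoint columns:

```typescript
// Hypothetical expense shape; real column names come from the SharePoint list.
interface Expense { dept: string; desc: string; amount: number; }

// Build one department's full block of Excel rows: one row per expense,
// then a final credit row holding the negated total. An Office Script
// could write this whole array in a single call, e.g. table.addRows(-1, rows),
// instead of one "Add a row into a table" action per expense.
function buildRows(dept: string, expenses: Expense[]): (string | number)[][] {
  const rows: (string | number)[][] = expenses.map(e => [e.dept, e.desc, e.amount]);
  const total = expenses.reduce((sum, e) => sum + e.amount, 0);
  rows.push([dept, "Credit total", -total]);
  return rows;
}
```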
Thank you.
What I am doing is creating an accounting document which will later be imported into the accounting program. So I retrieve all open items from a SharePoint list; these expenses are reported via PowerApps.
Then I find the unique departments.
For each department/store, I create accounting line items per expense. When all expenses are booked in Excel, I create a new line item with the credit sum amount of all expenses for the store.
I have kept the filtering solution rather than a new Get elements. I have 270 departments/stores and approx. 7,000-10,000 expenses in total across all of them. I have also used your sum solution to quickly sum per department.
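The credit line described above is just the negated sum of the store's expense amounts. A minimal sketch of that step (TypeScript; the amounts array stands in for the filtered expense data):

```typescript
// Credit amount for one store: negate the sum of its expense lines
// so the booked lines net to zero.
function creditAmount(amounts: number[]): number {
  return -amounts.reduce((sum, a) => sum + a, 0);
}

console.log(creditAmount([10.5, 4.5, 5])); // -20
```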
I now have two issues.
1) I need to multiply the sum by -1. It works if there are no decimals; otherwise it gives an error that mul expects an int. I have tried mul(float(outputs('Formater_tall')?['body']), -1), but I still get the same error.
2) Updating Excel row by row is really slow. Is there a way to add all the information to an array and then paste it all in at once?
@MegaOctane1 You’re on the right track.
As long as you have fewer than 100k rows and want to do this for every department, it's more efficient to load the entire dataset with a single Get items, then use the Filter array action to repeatedly grab each department's data from that result. Which sounds like what you described.
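As a sketch of that shape in TypeScript (one fetch, then in-memory filtering; the Department and Amount fields are placeholders for the real list columns):

```typescript
// Hypothetical item shape for rows returned by one Get items call.
interface Item { Department: string; Amount: number; }

// "Filter array" equivalent: bucket every row under its department once,
// instead of issuing a separately filtered Get items call per department.
function groupByDepartment(items: Item[]): Map<string, Item[]> {
  const groups = new Map<string, Item[]>();
  for (const item of items) {
    const bucket = groups.get(item.Department) ?? [];
    bucket.push(item);
    groups.set(item.Department, bucket);
  }
  return groups;
}
```

Each map entry then plays the role of one Filter array output for the loop body.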
I'm not sure how many expenses per department you may have. If it's only a couple dozen per department, then adding one Excel row at a time is fine and likely easier to read/follow when reviewing the flow.
I'm not sure what you are using to sum all the expenses per department, though. If you are doing any looping for that, then a set-up like this one (https://www.tachytelic.net/2021/06/power-automate-instant-sum-array/?amp) may be more efficient.
But you may also want to use an Office Script instead if you want more standard Excel total formatting.