Hello!
Over the last year I have learned so much from building flows! Normally I can figure out my issues on my own, but this time I need help.
VISION/PLAN: Take a large raw Excel file (1,000+ rows, 100 columns, and not all cells have values) exported from a parent system, place it in a document library, transfer the data to a static Excel file (the master file), then update MS Lists.
WHAT I HAVE DONE: Built a 3-phase flow.
Phase 1) When the raw Excel file is placed in the document library, a flow converts the range of values into a table. Once converted, the file is moved to another document library to trigger Phase 2.
Phase 2) Compare data from the raw Excel file to the master file using a Filter array to update/add rows from the raw data to the master file. The master file takes the data and applies all the formatting so there are no issues when transferring data to MS Lists. The raw Excel file is then moved to its final resting place in a third library, which triggers the final phase.
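For reference, the Phase 2 compare/update logic is roughly this (a Python sketch of the idea only, not the actual flow; the "ID" key column and the row shapes are placeholders I made up):

```python
# Rough sketch of the Phase 2 merge: raw rows are matched to master rows
# by a key column ("ID" here is a placeholder for whatever the real key is).
def merge_raw_into_master(raw_rows, master_rows, key="ID"):
    # Index the master rows by key once, so each raw row costs one dict
    # lookup instead of a full filter/scan over the whole master table.
    index = {row[key]: row for row in master_rows}
    for row in raw_rows:
        if row[key] in index:
            index[row[key]].update(row)    # update the existing master row
        else:
            master_rows.append(dict(row))  # add a brand-new row
            index[row[key]] = master_rows[-1]
    return master_rows
```

My understanding is that a Filter array inside an Apply to each effectively rescans the master table once per raw row, which may be part of why it degrades past a few hundred rows.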
Phase 3) Take data from the master file and update MS Lists accordingly. The raw data is then archived.
Issues: All phases were built successfully and the information flowed just as expected; however, when I transfer data in Phase 2 the flow slows down considerably after about 800 rows. My pagination is set to 5000 and concurrency is set to 50 on the 'Apply to each' loop (add/update rows), and my retry policy is set to default. I would also occasionally get a 429 (throttling) error, so I added a 30-second delay inside the 'Apply to each'.
I've even brought concurrency down to 30, and it is still taking far longer than it should.
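For what it's worth, the fixed 30-second delay was my stand-in for proper throttling handling; what I was trying to approximate is something like exponential backoff with jitter (a generic Python sketch, where `do_update` is a placeholder for one row update, not a real connector call):

```python
import random
import time

def update_with_backoff(do_update, max_retries=5, base_delay=2.0):
    # Retry a single row update, backing off exponentially on a 429
    # instead of waiting a fixed 30 seconds on every iteration.
    for attempt in range(max_retries):
        try:
            return do_update()
        except RuntimeError as err:  # stand-in for an HTTP 429 response
            if "429" not in str(err) or attempt == max_retries - 1:
                raise
            # Exponential backoff with jitter: ~2s, ~4s, ~8s, ...
            time.sleep(base_delay * (2 ** attempt) + random.random())
```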
Does anyone have any tips on how I can improve my flow in Phase 2 and make it more efficient? (Note: I do not have the Office Scripts option, so I can't use that.)