I am trying to parse a large CSV file (about 100,000 rows), but I'm having trouble getting the flow to complete. My first iteration of the flow was built with only "Compose" actions instead of variables so that concurrency could be enabled, but even with concurrency turned up to 20, the flow still had not completed after 24 hours.
For the second iteration of the flow, I switched to variables so that I could process the CSV in batches of 5,000 rows using a take(skip()) expression:
take(skip(outputs('SplitByNewLine'), variables('ChunkMultiplier')), variables('BatchSize'))
Each time a batch finishes processing, I increment a "NumChunksProcessed" variable.
ChunkMultiplier = NumChunksProcessed * BatchSize, where BatchSize = 5,000.
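To make the arithmetic concrete, here is a hypothetical Python sketch of the batching logic described above; take(skip(list, n), size) in Power Automate behaves like the slice list[n : n + size], and the function and variable names here are illustrative, not from the actual flow:

```python
# Stand-in for the output of the "SplitByNewLine" action.
rows = [f"row{i}" for i in range(100_000)]

BATCH_SIZE = 5_000

def next_batch(rows, num_chunks_processed, batch_size=BATCH_SIZE):
    """Return one batch, mimicking take(skip(rows, ChunkMultiplier), BatchSize)."""
    chunk_multiplier = num_chunks_processed * batch_size  # the skip offset
    return rows[chunk_multiplier : chunk_multiplier + batch_size]

first = next_batch(rows, 0)   # rows 0..4999
second = next_batch(rows, 1)  # rows 5000..9999
```

Each increment of NumChunksProcessed advances the skip offset by one full batch, so consecutive batches never overlap.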
The "Apply to each CSV" control sits inside a "Do until" control. The "Do until" condition is to keep processing batches until NumChunksProcessed equals the number of CSV lines divided by BatchSize. However, the steps inside the "Do until" only run twice, even though they should run 20 times (100,000 / 5,000 = 20).
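For reference, here is a minimal Python sketch of the intended loop behavior, assuming NumChunksProcessed is incremented exactly once per batch; with 100,000 rows and a batch size of 5,000, the loop body should execute 20 times:

```python
TOTAL_ROWS = 100_000
BATCH_SIZE = 5_000

num_chunks_processed = 0
iterations = 0

# "Do until" condition: NumChunksProcessed = (number of CSV lines) / BatchSize
while num_chunks_processed != TOTAL_ROWS // BATCH_SIZE:
    # ...process one batch of 5,000 rows here...
    num_chunks_processed += 1  # increment after each batch
    iterations += 1
```

If the real flow stops after two iterations instead, something other than this condition (e.g. how or where the variable is incremented) must be cutting the loop short.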
Does anyone have any advice on processing large CSV files in chunks? Is 100,000 rows too many for Power Automate to handle?