I'm creating a flow that reads a CSV file received as an email attachment and appends its contents to an Excel file.
The problem is that the CSV can be massive, and the flow ends with a timeout. I've read about changing the timeout settings, but my issue is with the Apply to each loop: the actions inside the loop are quite quick individually, but because the CSV is so large the flow still fails.
Is there any way to sort this out? I've considered splitting the CSV, but there must be a simpler way, such as changing the timeout.
Thanks.
@Anonymous ,
For 500 lines you can try manually setting the concurrency on the Apply to each action; the execution time will be significantly reduced: … -> Settings -> turn on Concurrency Control and set the Degree of Parallelism to 50.
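As a rough analogy only (this is plain Python, not anything Power Automate runs, and process_row is a hypothetical stand-in for the actions inside your loop), this sketch shows why a degree of parallelism of 50 cuts the wall-clock time so sharply:

```python
# Rough analogy for "Degree of Parallelism = 50": instead of handling one
# row at a time, up to 50 rows are in flight at once.
import time
from concurrent.futures import ThreadPoolExecutor

def process_row(row):
    time.sleep(1)          # ~1 second of work per row, as reported in this thread
    return row

rows = list(range(500))    # ~500 CSV lines

# Sequential: ~500 s of wall-clock time. With 50 workers: ~10 s.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(process_row, rows))
```

One caveat: with concurrency turned on, the loop iterations no longer finish in order, so if the rows must land in the Excel file in the CSV's original order this may not be an option.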
Hi,
The one I'm working on at the moment has ~500 lines.
It takes about a second per line (extracting the content and inserting it into the specified Excel file for later work), so it times out after about 450 lines (roughly seven and a half minutes).
My company hasn't authorized further licensing so far, so I'll have to split the files; a quick sketch of that is below.
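In case it helps anyone else, here is a minimal sketch of splitting the CSV before the flow picks it up. It assumes the file has a header row that every chunk should keep; the path and chunk size are just examples.

```python
# Minimal sketch: split a large CSV into chunks small enough for the flow,
# repeating the header row in every chunk. Path and chunk size are examples.
import csv

def write_chunk(path, part, header, rows):
    out = f"{path.rsplit('.', 1)[0]}_part{part}.csv"
    with open(out, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)

def split_csv(path, rows_per_chunk=400):
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk, part = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                part += 1
                write_chunk(path, part, header, chunk)
                chunk = []
        if chunk:
            write_chunk(path, part + 1, header, chunk)

split_csv("report.csv")  # e.g. 500 rows -> report_part1.csv and report_part2.csv
```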
@Anonymous ,
Can you define what a massive CSV file means to you?
At first glance, I don't think Power Automate is suitable for this kind of task. I say that because, depending on your Power Automate license (performance profile), the number of actions is limited on a daily basis.
Also, Apply_to_each has a limited number of iterations depending on the license you have, which is probably why the flow fails at present.
So your flow will most probably work in its current form if you have a suitable license, but be aware of the limits.
Hope it helps!