Hi everyone,
I have a flow that takes rows from an Excel table and inserts them into a SQL database in Azure. The inserts into the SQL database are happening, but at an extremely slow rate: inserting 256 rows took 14 minutes to complete. I have a column that records the time when each record is created, and there you can clearly see how slow the inserts are. Is there anything that can be done to speed this up?
You can also try using this batch SQL template:
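A minimal sketch of what such a template could look like; the dbo.Employee table and its columns are assumptions, and the literal values would be built from the flow's output:

INSERT INTO dbo.Employee (AccountName, FirstName, Surname)
VALUES
    ('Contoso', 'Jane', 'Doe'),     -- one row per Excel record
    ('Fabrikam', 'John', 'Smith');  -- batching many rows into a single INSERT cuts round trips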
Hi JoostPielage,
Could you please let me know how you added Account Name, First Name and Surname? No individual column is available in the output of 'List rows present in a table'.
Hi, could you give an example of the SQL query used to insert the OUTPUT?
I thought something like :
INSERT INTO Employee
SELECT *
FROM OPENJSON({OUTPUT})
-- a WITH clause is needed to map JSON properties to typed columns; names here are illustrative
WITH (AccountName nvarchar(100), FirstName nvarchar(50), Surname nvarchar(50));
Hey @Anonymous
How many rows are you processing?
I had 5k+ rows, so I used a do-until loop, a couple of variables, and condition logic to process all the records.
I am using the Execute stored procedure action right after the Compose that builds the JSON output, so it inserts 5k rows per batch. I increment a SkipCount variable and use it in the Skip Count advanced parameter of the Excel 'List rows present in a table' action.
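For illustration, that stored procedure can be shaped roughly like this (the procedure name, table, and columns below are placeholders, not the exact setup):

CREATE PROCEDURE dbo.usp_InsertEmployeeBatch
    @JSON nvarchar(max)
AS
BEGIN
    -- shred the JSON array from the flow into typed rows and insert them in one statement
    INSERT INTO dbo.Employee (AccountName, FirstName, Surname)
    SELECT AccountName, FirstName, Surname
    FROM OPENJSON(@JSON)
    WITH (
        AccountName nvarchar(100) '$."Account Name"',
        FirstName   nvarchar(50)  '$."First Name"',
        Surname     nvarchar(50)  '$.Surname'
    );
END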
Let me know if you have any questions
Hi @Pranshu27,
Please let me know once you post your blog article. I have exactly the same approach as you described: 5k-row portions that take 2 minutes each to load into SQL. It would be great to compare and see if there is still something to improve in my pipeline.
Thanks a lot for this approach. I was able to use it to get the desired output. Previously it took me 1 hour to insert 5k rows; now I am able to insert 5300 rows within 2 minutes... I take the JSON output from Compose and use it as an input to a stored procedure, which inserts the records into the destination table using OPENJSON. I used variables and a do-until loop to process more than 5k rows, which is a limitation in MS Flow. I will definitely write a blog about it.
Hi @Anonymous ,
Thanks for the examples - really appreciated! I will have a try and see if I can make it work for my flow...thanks again!
Richard
Hi @RichardW ,
I basically store the JSON output from the Compose in a SQL table variable.
DECLARE @TheTable TABLE (TheJSON nvarchar(max));
DECLARE @JSON nvarchar(max);
INSERT INTO @TheTable SELECT '@{outputs('Compose_5')}';
SET @JSON = (SELECT TheJSON FROM @TheTable);
After that, you can use the JSON to insert the values into a table. You need to make sure that your column names match up. You can see an example here:
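In outline, that follow-up insert can look something like this (dbo.Employee and the column list are illustrative, and this assumes the JSON property names match the table's columns):

INSERT INTO dbo.Employee (AccountName, FirstName, Surname)
SELECT AccountName, FirstName, Surname
FROM OPENJSON(@JSON)
WITH (
    AccountName nvarchar(100),
    FirstName   nvarchar(50),
    Surname     nvarchar(50)
);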
If you have any questions, let me know. I'll try and help where I can.
Hi @Anonymous ,
I am glad I found your topic, as I am also struggling to insert 50k+ rows from Excel into SQL Server using a flow... Could I ask how you use the Compose output in your SQL query to insert? Do you mind providing a quick example or some documentation as a reference? BTW, our SQL Server is on-premises, so I will probably wrap the query in a stored procedure. Thanks for your help in advance!
Richard
Hi @Anonymous, if you go to the settings of your Excel step, you can enable pagination and go past the initial 5000-row limit. The ultimate limit is 100k. I have never tested that; I have gone up to 52k without a big issue. Good luck with your flow!