We have a Flow that pulls a table with 2550 records from our on-prem SQL Server, yet when the Flow adds them to a clean list, it only imports 2048 records. Has anyone seen this, or am I missing something obvious? We confirmed the count in the source table, confirmed a complete flush of the list, and confirmed the final count after the run.
We have tried lower counts and they work properly, i.e. 1,000 records import fine. The run takes about 34 minutes, so it doesn't seem to be timing out.
Check the Pagination setting on the "Get rows" action: turn it on and set the threshold above your record count, otherwise the default cap limits how many rows come back:
I've tried this and found that with pagination on and threshold set at 5000, the connector gets the correct number of records, but ends up missing some from the original source and duplicating others instead. Very odd. After a fair bit of tinkering I've surrendered and split the data on the SQL server side into 2 views for separate import. If anyone else has a robust solution though, would love to hear it
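Splitting the source into views is one way to stay under the per-call cap; the same idea can be expressed as client-side paging with `OFFSET`/`FETCH`. Below is a minimal Python sketch of the paging loop only. `fetch_page` is a hypothetical stand-in for whatever actually runs the query (the connector, or e.g. a `pyodbc` cursor); here it just slices a list so the loop logic can be shown end to end.

```python
# Work around a per-call row cap by requesting fixed-size pages until a
# short page signals the end of the table.

PAGE_SIZE = 2048  # the per-call cap observed in this thread

def fetch_page(table, offset, limit):
    """Hypothetical stand-in for:
    SELECT * FROM dbo.MyTable ORDER BY Id
    OFFSET @offset ROWS FETCH NEXT @limit ROWS ONLY
    (a stable ORDER BY is required, or pages can overlap/skip rows)."""
    return table[offset:offset + limit]

def fetch_all(table):
    rows = []
    offset = 0
    while True:
        page = fetch_page(table, offset, PAGE_SIZE)
        rows.extend(page)
        if len(page) < PAGE_SIZE:  # short page => no more rows
            return rows
        offset += PAGE_SIZE

source = list(range(2550))  # mirrors the 2550-record table
print(len(fetch_all(source)))  # all 2550 rows, not just the first 2048
```

Note the comment in `fetch_page`: without a deterministic `ORDER BY`, SQL Server may return rows in a different order on each call, which would explain the missing-and-duplicated rows seen with pagination turned on.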
Hi CivilNJ,
I have turned pagination on and set the threshold to 10000, but I am still not able to get all the records.
Hi CivilNJ,
I am afraid there is a limit on the number of records the "Get rows" action can retrieve; it returns up to 2048 rows.
Please consider creating a request at the Flow Ideas Forum to expand the total number of rows that the "Get rows" action can retrieve:
https://powerusers.microsoft.com/t5/Flow-Ideas/idb-p/FlowIdeas
Best regards,
Mabel Mao