I am running "Get Items" with pagination set to 100,000 items on a large list, but I only really need data from the ~50k most recent items, so I order by "Created desc". I found that if I did not do this, I only got about a tenth of the items. I believe this is expected behaviour, since the action only scans the first 100k items; ordering by "Created desc" resolves that initial issue.
Manually exporting the list, I can see there are 1,336 items that fit my filter. When I run the flow and count the rows, I see the following:
I run the exact same flow again, with no change to the back-end data, and get this:
If I run it again, I might get either of those results or a different number entirely. The items it sometimes misses do meet the filter criteria, so they should be returned. The quantity returned seems random, which gives me very low confidence in the results. This is not an isolated scenario: across many tests I found the same issue, and even after reducing the filter to a single condition I had the same problem of different quantities of data being returned.
As an alternative, I am using an HTTP GET request in a continuous loop and then filtering the items. This returns consistent results, but it is much slower (over 5 minutes) than "Get Items" (under 1 minute).
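For anyone curious what I mean by the HTTP loop: it is the standard SharePoint REST paging pattern, where each response includes the current page of items plus a next-page link until the results are exhausted. A minimal sketch of that loop (the endpoint URL, list name, and `fetch` callable are placeholders, not my actual flow):

```python
from typing import Callable, Iterator, Optional

def page_items(fetch: Callable[[str], dict], first_url: str) -> Iterator[dict]:
    """Follow SharePoint-style paging: each response body carries the items
    in 'value' and, while more pages remain, an 'odata.nextLink' URL."""
    url: Optional[str] = first_url
    while url:
        body = fetch(url)          # fetch() = one authenticated HTTP GET
        yield from body.get("value", [])
        url = body.get("odata.nextLink")  # absent on the last page

# Hypothetical first request: order newest-first and cap the page size,
# mirroring the "Created desc" ordering used in the Get Items action.
FIRST_URL = ("https://contoso.sharepoint.com/_api/web/lists/"
             "getbytitle('MyList')/items"
             "?$orderby=Created desc&$top=5000")
```

Each loop iteration is a full HTTP round trip, which is why this approach is so much slower than a single "Get Items" action with pagination enabled.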
Any thoughts on why this is happening or ways to resolve it would be greatly appreciated.