Hi,
There are a few old posts on collection row limits, but as these appear to have gone stale I thought I'd start a fresh thread.
I've always worked on the basis that collection size (and any data returned from tables etc.) is limited by the Data Row Limit setting in the app - the default is 500.
Some context.
I'm currently working on building a PowerApp that connects to multiple data sources, one of which is a legacy on-premises Oracle database (with over 10 years of data). As is typical, this is a normalised database and requires several table joins to produce person-friendly information.
Even after bumping the row limit to 2,000, I still see many of the tables hitting this limit and unable to display all of the data.
Filtering the data has been a challenge, as getting the data needed requires joining six or more tables BEFORE you can filter it (normalised data on steroids). Consequently, a lot of data gets missed/lost.
Switching to Flow to do the heavy lifting, I've created a SQL query that makes the table joins and takes a variable from the app to filter the results.
This flow is called directly from the app, with the response added to a collection for display and further refinement.
This all works great and is very fast.
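For reference, the app side of this is just a couple of lines of Power Fx - the flow, control and collection names below are placeholders rather than my real ones:

// Button.OnSelect - run the flow with the filter value from the app
// and add the response to a collection
ClearCollect(
    colOracleData,
    GetOracleData.Run(txtFilter.Text)
)

// Label.Text - how I'm checking the row count of that collection
CountRows(colOracleData)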
What I'm confused by is that although I have the app row limit set at 500, CountRows on the collection returns significantly more than 500 - in one case, 2,750.
I don't want to count 2,000+ rows by hand to check.
My question is... is this a true count?
If the answer is yes...
Why does the response from a Flow bypass the app row limit?
Is there a maximum number of rows that can be returned this way?
If the answer is no...
Why am I getting a false row count?
Thanks in advance.
I'm glad I could assist you.
Here's an article that explains what can be delegated and what can't.
SQL Server - Connectors | Microsoft Learn
Hope it helps.
Hi @DBO_DV
Thanks for your clear explanation, very helpful.
Now that I know the row count is true, I can start playing around with various filters to see what can be delegated down to the SQL query. I suspect that if you can run the query normally, it will perform the same when triggered in Flow, and therefore you should be able to push most filters down to the query and take the load off the app.
This will make working with big data sets much easier... and SQL queries are much easier to modify if you forget something.
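As a first step, I'll probably just pass the filter values straight into the flow run so the WHERE clause does the filtering instead of the app - roughly like this, with placeholder control and flow names:

// Each Run() argument maps to a parameter used by the WHERE clause
// inside the flow's SQL query
ClearCollect(
    colOracleData,
    GetOracleData.Run(
        txtSurname.Text,
        drpDepartment.Selected.Value
    )
)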
Hey @Gary_Eden ,
It is the true count.
Here's what happens in Power Apps when you connect directly to the data source and delegation doesn't work: Power Apps sends all those filters to the data source, except for the CountRows function. Since CountRows isn't delegated to your data source, the source returns only 500 rows to Power Apps (you could say they are then local in Power Apps), and CountRows then counts them. Because CountRows can't be delegated, you'll always get a maximum of 500 rows.
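To illustrate with a made-up table and a direct connection, something like this will never report more than the Data Row Limit:

// Filter may be delegated, but CountRows is not, so only the first
// 500 rows (the Data Row Limit) reach the app and get counted locally
CountRows(
    Filter(
        PersonsTable,
        Department = "Finance"
    )
)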
With Power Automate, you get the table back from the flow inside your app (you could say the table is now local), and then CountRows does its work. Power Automate gives you a local copy of the whole table, since it isn't bound to the Power Apps limits and you're explicitly asking it to return the data like this with the Respond to Power Apps action.
The maximum size depends on your data source, but it is quite easy to work around those limits in Power Automate, so your table can be as big as you want. However, the flow needs to respond to the app within a certain time frame, and if the table is too big you'll run into serious performance issues (crashing, lagging, ...), so you always need to consider carefully how much data you actually need.
If you've got any further questions, don't hesitate to ask.