Please note
- I did not build this app, so my responses may take a bit longer than usual.
- The Dataverse table 'Pricing' holds 1.7M+ rows.
- I cannot share certain data or table details due to data sensitivity.
Overview
In short, I have a flow that looks up pricing for parts based on a Service Bus message.
The app has been in use for over a year now, and the volume of data the filter operates on grows as time passes.
The issue is that the filter always outputs an empty array. I have manually checked the tables, and matching records do exist.
Furthermore, this is a Production-only issue: the exact same process runs in UAT for the same parts and executes as expected.
The only difference is the amount of data being filtered.
Initially the issue occurred for only one part; however, it is now causing multiple failures as more parts are added to the table.
The data must remain in the table for audit purposes, which is why it is not being deleted.
The filter can receive anywhere from 1 to 50k+ records in a single run.
Questions
Is there a limit on the number of rows a filter query can process in Power Automate?
If so, is there any way to mitigate this issue, e.g. by archiving/deactivating records?
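For context on the archiving/deactivating idea: below is a minimal sketch of what a server-side condition excluding deactivated rows might look like in the Dataverse 'List rows' Filter rows field. It assumes the standard statecode column (0 = Active, 1 = Inactive) and uses a hypothetical lookup column name with a placeholder GUID; it is not taken from the actual flow.

statecode eq 0 and _prefix_lookupcolumn_value eq '<GUID placeholder>'

Deactivated rows would then never reach the in-flow filter at all, while still remaining in the table for audit purposes.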
Screenshot
*'Outputs' in 'Filter Rows' is a composed filter query built earlier in the flow, e.g. xyz eq GUIDvalue
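For illustration only (real column names are withheld): a composed condition of that shape could be built with a Compose action using an expression along these lines, where xyz stands in for the real column and RequestMessageObject/item0 mirrors the variable shape used later in the filter.

concat('xyz eq ', variables('RequestMessageObject')?['item0'])

This outputs a string of the form xyz eq GUIDvalue, which is then fed into 'Filter Rows'.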
Filter Query
@and(
  equals(item()?['prefix_Table/column'], variables('RequestMessageObject')?['item0']),
  equals(item()?['prefix_Table/column2'], variables('RequestMessageObject')?['item1']),
  equals(toLower(item()?['prefix_Table/LookupTable/column3']), toLower(variables('RequestMessageObject')?['item2'])),
  equals(item()?['prefix_Table2/column4'], 'true')
)
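As a debugging aid (a sketch, not part of the current flow): each condition can be evaluated in isolation against the first Production record by adding a temporary Compose action. The action name 'List_rows' is an assumption; substitute whatever action actually feeds the filter. For example, for the first condition:

equals(first(outputs('List_rows')?['body/value'])?['prefix_Table/column'], variables('RequestMessageObject')?['item0'])

Repeating this per condition should reveal which comparison returns false in Production but true in UAT.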