Hello All,
I am using the 'Run a query against a dataset' connector. Last week I was able to get all 67,000 rows (10 columns) back using this connector, but this week I am only able to get 58,000 rows (roughly 10,000 fewer than before) no matter what I do. The connector is clearly capable of returning the number of rows I need, but I can't figure out how to get it working again. I have tried creating another, similar flow, and I have tested the query itself, which shows the correct number of rows; it just doesn't seem to work when entered into Power Automate.
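(For reference, a row-count query like the one below is one way to confirm the dataset itself returns the full count; 'MyTable' is just a placeholder for your own table name:)

// Returns a single-row table with the total row count,
// to compare against what the connector actually returns.
EVALUATE
ROW ( "RowCount", COUNTROWS ( 'MyTable' ) )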
Has anyone else experienced this, and if so, how did you fix it? I need all rows to come back at once, and to my knowledge I am not hitting any limits.
Please help!
Emily Pollard
How did you limit the batch size? Was it in the query itself or through the Power Automate flow?
Thanks, it works now. After limiting the batch size to 2,000 rows, it is able to grab all the columns.
It works regardless of the number of columns, but you may need to reduce the batch size. In your case, a batch size of 2,000 rows should work.
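As a sketch of what batching can look like on the query side (assuming the table has a sequential index column; 'MyTable' and 'MyTable'[Index] are placeholders), each run of the action pulls one 2,000-row slice:

// Pull rows 1-2000; bump both bounds by 2,000 on each pass
// (e.g. from a Power Automate loop variable) to page through the table.
EVALUATE
FILTER (
    'MyTable',
    'MyTable'[Index] > 0 && 'MyTable'[Index] <= 2000
)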
I think this video solves grabbing more than 100K rows. It seems possible if we have fewer columns (5-15), but if we have more than 50-60 columns, 'Run a query against a dataset' does not return all the rows. I have a dataset of around 5,000 rows; if I add all the columns (more than 50), I only get 2,227 rows.
Do you have a measure column? If so, you have to define the measure when running the query.
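For example, a query-scoped measure can be declared up front with DEFINE MEASURE (all table, column, and measure names here are placeholders):

// Define the measure inside the query so the connector can evaluate it,
// then return it alongside the grouping column.
DEFINE
    MEASURE 'Sales'[Total Amount] = SUM ( 'Sales'[Amount] )
EVALUATE
SUMMARIZECOLUMNS (
    'Sales'[Category],
    "Total Amount", [Total Amount]
)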
Thanks. I saw that workaround and planned on using it if folks still want the larger dump. It seemed like a lot of hoops to jump through.
But thanks for the response!
We fixed it; check out this video:
Best I can tell, there is some total data limit in the "Run a query against a dataset" call. If I add more columns (more data), I get fewer rows. I've been searching, but I haven't seen any documentation that speaks to the limit or whether it can be changed.
I even looked at the raw JSON output of the query step, and its records were limited. (That is, in my case, it wasn't the "Create CSV table" next step that was causing the limit.)
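If the limit really is on total data rather than row count, trimming the query to only the columns needed downstream should stretch it further. A sketch, with placeholder table and column names:

// Return only the columns the flow actually uses; fewer values per row
// means more rows fit under a total-data cap.
EVALUATE
SELECTCOLUMNS (
    'MyTable',
    "ID", 'MyTable'[ID],
    "Amount", 'MyTable'[Amount]
)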
I have the exact same issue in multiple flows; I have no idea what is causing it.