Hello! I'm trying to speed up fetching more than 400,000 rows from a PostgreSQL table. Currently I load the rows into a collection, which takes around 10 minutes and sometimes fails outright. I also tried Power Automate, but it is limited to 100,000 rows and takes about 8 minutes to fetch those.
Is there any way to speed up this process?
For example, imagine we have 20 rows for the same shoe, where the only difference between them is the color. We want to use these rows as a template to bulk-update the price across all 20 variants, and this will run on a monthly basis.
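To make the goal concrete, here is a minimal sketch of the kind of bulk price update I'd like to end up with each month, run against the database rather than row by row from the app. The table name `shoes` and the columns `style`, `color`, and `price` are placeholders for illustration, not my real schema:

```sql
-- Hypothetical table: shoes(style, color, price); names are placeholders.
-- One statement updates the price for every colour variant of a style,
-- instead of fetching all the rows into the app and writing them back one by one.
UPDATE shoes
SET price = 79.99
WHERE style = 'Trail Runner';  -- all 20 colour rows for this style get the new price
```

If something like this can be triggered from the app (or on a monthly schedule), I wouldn't need to pull all 400,000 rows into a collection at all, but I'm open to any approach that makes the fetch itself faster.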