Hello! I'm trying to speed up fetching more than 400,000 rows from a PostgreSQL table. Currently I'm using the collections method, which takes around 10 minutes to load and sometimes fails outright. I also tried Power Automate, but it is limited to 100,000 rows and takes about 8 minutes to fetch those.
Is there any way to speed up this process?
For example, imagine we have 20 types of shoes, where the only difference between them is color. We want to use these rows as a template to bulk update the price across all 20 shoes. This process will run on a monthly basis.
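One pattern worth considering for this kind of monthly update is to push the work to the database itself rather than fetching all the rows into the app first: a single server-side UPDATE can change every variant at once. Below is a minimal sketch of that idea. The `shoes` table, its columns, and the prices are all hypothetical, and Python's built-in sqlite3 is used as a stand-in for PostgreSQL so the example is self-contained; with a real PostgreSQL database the same SQL would run through a driver such as psycopg2.

```python
import sqlite3

# Hypothetical schema: 20 rows share one style and differ only by color,
# mirroring the shoe example above. sqlite3 stands in for PostgreSQL here.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE shoes (id INTEGER PRIMARY KEY, style TEXT, color TEXT, price REAL)"
)
colors = [f"color{i}" for i in range(20)]
conn.executemany(
    "INSERT INTO shoes (style, color, price) VALUES ('runner', ?, 49.99)",
    [(c,) for c in colors],
)

# One statement updates all 20 variants at once -- the client never has
# to download the rows, so row-count limits and fetch time don't apply.
conn.execute("UPDATE shoes SET price = ? WHERE style = ?", (59.99, "runner"))
conn.commit()

prices = {row[0] for row in conn.execute(
    "SELECT price FROM shoes WHERE style = 'runner'"
)}
print(prices)  # every variant now carries the new price
```

The same approach scales to 400,000 rows because the database engine applies the change in place; only the UPDATE statement and its parameters cross the network.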