Dear all,
I have an external data source that is updated regularly (every few minutes during a workday), and I retrieve its data through a REST API.
Currently I am fetching all of the current day's data with a Standard Dataflow, every 5 minutes. This means that I read the same (hundreds of) records each time, plus any records created during the day. My dataflows then create, update, and delete (CRUD) these records in multiple (standard and custom) entities in my Dataverse.
It seems that this method is causing a lot of overhead on the system: the dataflow executions are exceeding the available capacity and are being slowed down after a while.
Should I try to convert these Dataflows into Power Automate flows that are only triggered when data changes in the source (if I can somehow detect that)?
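
To illustrate what I mean by "detecting changes": here is a rough sketch of the incremental polling I have in mind, assuming the source API supports filtering on a last-modified timestamp (the `modified_since` parameter and the endpoint URL below are placeholders, not the real API):

```python
# Sketch: incremental polling with a high-water mark, assuming the source
# REST API can filter on a last-modified timestamp. The "modified_since"
# parameter and SOURCE_URL are hypothetical placeholders.
import json
import pathlib
from datetime import datetime, timezone

import requests

STATE_FILE = pathlib.Path("last_sync.json")      # stores the high-water mark
SOURCE_URL = "https://example.com/api/records"   # placeholder endpoint


def load_high_water_mark() -> str:
    """Return the timestamp of the last successful sync (ISO 8601)."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_sync"]
    return "1970-01-01T00:00:00+00:00"  # first run: fetch everything


def fetch_changed_records(since: str) -> list[dict]:
    """Ask the source only for records changed since the last sync."""
    resp = requests.get(SOURCE_URL, params={"modified_since": since}, timeout=30)
    resp.raise_for_status()
    return resp.json()


def main() -> None:
    since = load_high_water_mark()
    for record in fetch_changed_records(since):
        # Upsert into Dataverse here (e.g. via the Dataverse Web API or an
        # HTTP-triggered Cloud Flow); omitted for brevity.
        print("would upsert:", record)
    # Advance the high-water mark only after a successful run.
    STATE_FILE.write_text(
        json.dumps({"last_sync": datetime.now(timezone.utc).isoformat()})
    )


if __name__ == "__main__":
    main()
```

That way only the changed records would flow into Dataverse instead of the whole day's data on every run.
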
This makes me wonder when one should choose a Standard Dataflow over a Power Automate Scheduled Cloud Flow. In this case they both achieve the same thing, namely reading an external source and copying the data into Dataverse.
Could anyone shed some light on the "best practices" for choosing between Dataflows and Cloud Flows?
Thanks,
Koen