Is there any way to scale up the compute power used to process dataflows? Dataflows can get quite complex and can process fairly large volumes of data, and as a result many of my flows take two to three hours to load into Dataverse.
It would be great if we could pay to "scale up" the compute power, a bit like resizing a Virtual Machine in Azure. Yes, yes, I know Dataverse isn't infrastructure as a service (IaaS). Nevertheless, the bottom line is that we would be willing to pay to process our dataflows faster.
Is there any way to do this?