My sincere request to the community here is to share your thoughts on the scalability of Power Automate cloud flows and the related best practices. Those of us trying to use the low/no-code approach in real-world commercial implementations, involving thousands of DAUs and high-concurrency situations, need to know the capabilities and limitations, along with the licence requirements and cost implications. This will help us make better recommendations and help customers make informed decisions. I have checked, and the official documentation on this is quite limited and lacks detail grounded in actual industry experience. A comprehensive discussion here could serve as a guide for practitioners worldwide.
A sample practical problem: a cloud flow takes a base64-encoded string as input and appends the decoded binary to an append blob. It appends because it is called multiple times per large file upload: the calling party splits the file into chunks under 50 MB each so the payload is not rejected at the HTTP input trigger. Now, if 1,000 concurrent users upload such files, with the same flow called 10-50 times per file, how can we make sure it scales? Is this even a good design, and should a Power Automate flow be used in such a use case at all? We need to know the actual practical limits of Power Automate flows in real scenarios, not just the documented request limits, size limits, etc.
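To make the calling-party side of this scenario concrete, here is a minimal sketch of the client-side chunking. It assumes a hypothetical `FLOW_TRIGGER_URL` and a `fileId` correlation field in the trigger payload (neither is from an official schema), and a chunk size deliberately below the 50 MB ceiling mentioned above; it is an illustration of the pattern, not a definitive implementation:

```python
import base64

# Assumption: stay safely under the 50 MB per-call limit described above.
CHUNK_SIZE = 45 * 1024 * 1024

def chunk_file(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Split raw bytes into ordered base64-encoded chunks, one per flow call."""
    for offset in range(0, len(data), chunk_size):
        yield base64.b64encode(data[offset:offset + chunk_size]).decode("ascii")

# Each chunk would then be POSTed, in order, to the flow's HTTP trigger, e.g.:
#   requests.post(FLOW_TRIGGER_URL, json={"fileId": file_id, "chunk": chunk})
# (FLOW_TRIGGER_URL and the payload shape are hypothetical placeholders.)
```

Note that because an append blob writes blocks in arrival order, the chunks for any one file would have to be sent sequentially (or carry an ordering key), which is exactly why the concurrency question above matters.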
Any help will be invaluable to industry practitioners.