I think this is an important limitation of classic workflow monitoring. My idea was:
1. Extract to Excel all system jobs (workflow sessions) for one entity over a given time range
2. Estimate in Excel the execution peak and the average daily runs per workflow
3. Estimate the number of actions by counting the number of steps for each workflow and for each daily run
4. Estimate the average Web API calls per daily workflow execution as: CRUD workflow step operations + 1 (the workflow trigger) + child workflow/action calls + CRUD operations inside workflow custom activity plugins
Otherwise, it's difficult to estimate the throughput and the MB of data read per 5 minutes!
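The estimation in steps 1–4 can be sketched in a few lines of Python instead of Excel. This is only an illustration: the workflow names and per-workflow figures below are hypothetical placeholders for whatever your system-job extract actually shows, and the per-5-minute figure assumes runs are spread uniformly over 24 hours (real peaks will be higher).

```python
# Hypothetical figures; replace with values from your system-job (workflow
# session) extract for the chosen entity and time range.
workflows = [
    {"name": "WF_A", "daily_runs": 120, "crud_steps": 4, "child_calls": 1, "plugin_crud": 2},
    {"name": "WF_B", "daily_runs": 300, "crud_steps": 2, "child_calls": 0, "plugin_crud": 1},
]

def api_calls_per_run(wf):
    # CRUD workflow steps + 1 (the workflow trigger)
    # + child workflow/action calls + CRUD ops inside custom activity plugins
    return wf["crud_steps"] + 1 + wf["child_calls"] + wf["plugin_crud"]

# Total estimated Web API calls per day across all workflows
daily_total = sum(api_calls_per_run(wf) * wf["daily_runs"] for wf in workflows)

# Naive per-5-minute rate, assuming a uniform spread over 24 hours;
# multiply by a peak factor from step 2 for a worst-case estimate.
per_5min = daily_total / (24 * 60 / 5)

print(daily_total, per_5min)
```

With the sample numbers above this gives 8 calls per WF_A run and 4 per WF_B run, i.e. 2,160 calls/day and 7.5 calls per 5-minute window on average; the same approach does not cover data volume (MB read), which would need a per-call payload estimate on top.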
I think the only solution is:
* produce the forecast estimate described above
* replace the workflow logic with optimized, modern Power Automate flows (paying attention to Microsoft best practices) and validate them with strong data-quality tests and user acceptance testing
* after go-live, monitor the Power Automate flows using CoE and Dataverse analytics
What do you think? Do you have any suggestions or alternative solutions?