Hello,
I'm completely new to CDS and trying to test it out, but I'm running into limits that seem really strange in this day and age of high-velocity, high-volume data.
https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/api-limits
As described in the link above, I've been hitting the limit of 6,000 requests per 300 seconds while trying to do an initial load of data into a custom entity in CDS (the load is only 6,150 rows!).
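For reference, my load loop looks roughly like the Python sketch below. It posts one row at a time to the Web API and, when the service throttles with a 429, waits out the Retry-After header before continuing. The org URL, token handling and entity set name are all placeholders:

```python
import time
import requests

API = "https://yourorg.crm.dynamics.com/api/data/v9.1"  # placeholder org URL
HEADERS = {
    "Authorization": "Bearer <token>",  # token acquired via Azure AD elsewhere
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

def create_row(entity_set, record):
    """POST one record; on 429 (throttled), wait out the window and retry."""
    while True:
        resp = requests.post(f"{API}/{entity_set}", json=record, headers=HEADERS)
        if resp.status_code == 429:
            # The service says how long until the 300-second window resets
            time.sleep(int(resp.headers.get("Retry-After", "300")))
            continue
        resp.raise_for_status()
        return

rows = [{"new_name": f"row {i}"} for i in range(6150)]  # stand-in for the SQL data
for row in rows:
    create_row("new_myentities", row)  # placeholder entity set name
```

Even with the backoff this obviously makes the 6,150-row load crawl once the window is exhausted, which is what prompted the question.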
Between this and SharePoint's 5,000-item limit, it all seems unusable, which is why I'm questioning our understanding of these tools.
Going forwards I have lots of data I was planning to drop into CDS, as our organisation is considering increasing our reliance on the Power Platform and eventually moving to Dynamics as well.
In many cases it is very difficult to tell when data has changed or new rows have been added to our SQL tables, so a complete refresh on a schedule would be the common approach. Until we migrate to Dynamics, I was hoping to keep the CDS data up to date each night for use with Power Apps and flows.
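What I had in mind for the nightly refresh is roughly the sketch below: pull everything from SQL and upsert each row, which (as I understand the Web API docs) you get by PATCHing against an alternate key, so it updates the row if the key exists and creates it otherwise, and re-running the full pass every night stays idempotent. This assumes an alternate key is defined on the entity; the DSN, table and column names are placeholders, and since the gateway only serves Power Platform connectors, the script itself would have to run on the VM next to SQL Server:

```python
import pyodbc
import requests

API = "https://yourorg.crm.dynamics.com/api/data/v9.1"  # placeholder org URL
HEADERS = {
    "Authorization": "Bearer <token>",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

def upsert_row(entity_set, key_field, key_value, record):
    # PATCH against an alternate key is an upsert in the Web API:
    # update if the key exists, create otherwise
    url = f"{API}/{entity_set}({key_field}='{key_value}')"
    requests.patch(url, json=record, headers=HEADERS).raise_for_status()

conn = pyodbc.connect("DSN=OurSqlVm;Trusted_Connection=yes")  # placeholder DSN
for sql_id, name in conn.execute("SELECT Id, Name FROM dbo.SourceTable"):
    upsert_row("new_myentities", "new_sqlid", sql_id,
               {"new_sqlid": sql_id, "new_name": name})
```

But of course every one of those upserts counts against the same 6,000-per-300-seconds budget, so at our volumes the nightly window gets long fast.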
Are we thinking about CDS wrongly? The sort of limits listed in the link above make me think it isn't a production-ready environment. How are companies dealing with this in production?
I should probably point out that our SQL Server is in a VM, so it requires the on-premises data gateway to shift the data across. We are a small non-profit org; going down the full-on data warehouse route sounds a bit OTT, and as we move into Dynamics a DWH seemed unnecessary anyway.
Thanks for any pointers.