I want to transfer audit data from Dataverse to Kusto by accessing the audits API on a schedule. A scheduled task calls the Dataverse Web API to fetch the latest data, for example:

dynamics.com/api/data/v9.1/audits?$filter=createdon ge 2024-04-06T14:12:09Z and createdon le 2024-04-06T14:18:08Z and objecttypecode eq 'TABLENAME' and (operation eq 2 or operation eq 1) and attributemask ne null&$orderby=createdon asc

However, this API seems to return fewer records than actually exist, so some data gets lost. For example, the log shows that one call returned 10 records, but when I ran the same query again a few hours later it returned 20.
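Roughly, the scheduled task does the following (the org URL, table name, window size, and token handling below are simplified placeholders, not my real setup):

    import requests
    from datetime import datetime, timedelta, timezone

    # Placeholder values; the real org URL, table name, and auth flow are different.
    ORG_URL = "https://yourorg.crm.dynamics.com"
    TOKEN = "<bearer token acquired from Azure AD>"
    HEADERS = {
        "Authorization": f"Bearer {TOKEN}",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
    }

    def pull_audit_window(start: datetime, end: datetime, table: str) -> list[dict]:
        """Query the audits table for one time window, following @odata.nextLink paging."""
        filter_expr = (
            f"createdon ge {start:%Y-%m-%dT%H:%M:%SZ} and "
            f"createdon le {end:%Y-%m-%dT%H:%M:%SZ} and "
            f"objecttypecode eq '{table}' and "
            "(operation eq 2 or operation eq 1) and attributemask ne null"
        )
        url = f"{ORG_URL}/api/data/v9.1/audits"
        params = {"$filter": filter_expr, "$orderby": "createdon asc"}
        rows = []
        while url:
            resp = requests.get(url, headers=HEADERS, params=params)
            resp.raise_for_status()
            body = resp.json()
            rows.extend(body["value"])
            url = body.get("@odata.nextLink")  # next page, already a full URL
            params = None
        return rows

    # Each run covers the few minutes since the previous run.
    end = datetime.now(timezone.utc)
    start = end - timedelta(minutes=6)
    records = pull_audit_window(start, end, "TABLENAME")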
How can I ensure that all the data is obtained through the API?
If you're building this for MSIT or FastTrack group consumption, you could perhaps talk to the product group and see if you can get access to the underlying SQL and pull this data directly, bypassing the APIM.
Hi @bingo6
What you are trying to do is generally not advisable. Audit records are created automatically, and thousands of them can be generated depending on how heavily your app is used. If you are trying to read all of them and dump them into Kusto for some other integration or analysis, it's a bad idea. Here is why:
1. The Dataverse platform has service limits. Reading this many records on a regular schedule will sooner or later hit one of the service API limits, which will slow down your application for production users or cause problems elsewhere, such as Power Automate flows not triggering.
You can read about service limits here: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/api-limits?tabs=sdk
2. Keeping the audit data in sync with Kusto is going to be a challenge no matter how well it's built, because audit records are created by the platform automatically as changes happen, so at any given time your Kusto copy will always be behind Dataverse. As far as I know, you cannot put triggers directly on the audit table.
Here is how I would propose you tackle this problem. I am assuming you want this data in Kusto either for some kind of analysis or for another integration, for example with ICM.
For ICM, there are better solutions available; IM me on internal Teams (send me a private message and I can share my MS id with you).
For other analysis/integration scenarios, I would recommend the following approach.
Audit history is stored at the individual record level. There is an internal action, "RetrieveRecordChangeHistory", which can be called in the following way.
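A rough sketch of such a call (the org URL, token, and record id are placeholders, and the exact parameter encoding should be double-checked against the Web API reference for the RetrieveRecordChangeHistory function):

    import json
    import requests

    # Placeholder values; substitute your own org URL, token, and record id.
    ORG_URL = "https://yourorg.crm.dynamics.com"
    TOKEN = "<bearer token>"
    HEADERS = {
        "Authorization": f"Bearer {TOKEN}",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
    }

    # RetrieveRecordChangeHistory takes a Target entity reference plus paging info.
    url = f"{ORG_URL}/api/data/v9.1/RetrieveRecordChangeHistory(Target=@target,PagingInfo=@paging)"
    params = {
        "@target": json.dumps({"@odata.id": "accounts(00000000-0000-0000-0000-000000000001)"}),
        "@paging": json.dumps({"PageNumber": 1, "Count": 100}),
    }

    resp = requests.get(url, headers=HEADERS, params=params)
    resp.raise_for_status()

    # The response is expected to carry an AuditDetailCollection with the changes for this one record.
    payload = resp.json()
    for detail in payload.get("AuditDetailCollection", {}).get("AuditDetails", []):
        print(detail)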
Notice above that this is a paginated query, and it works at the record level, not against the entire audit table.
Use this action to get targeted audit records. It is best to pull only the records you need, when you need them, instead of trying to keep everything all the time.
Another option, if you must pull all of this, is to look at Azure Data Factory, which exposes the audit table through its Dynamics connection. You should be able to build a simple pipeline to move this data via ADF to Kusto/Cosmos.
Sorry, but when you say problem, do you mean the delay? As far as I know, you can't control the delay; it just depends on the system load.
Thanks. But how should I understand the problem I'm having, since I can't think of any other possible cause? And do you have any suggestions? I just want to regularly pull the Dataverse audit data into Kusto.
Dataverse auditing is an internal feature heavily used for compliance, and only a limited set of messages is supported on the audit table; as you can see, essentially only read and delete are supported. The delay might be happening because the backend system is under load, but ordering should be guaranteed, IMHO, because it's a compliance feature.
Thanks. I posted another question: "Are the records in the audits table inserted in or..." - Power Platform Community (microsoft.com). I guess this may be related to insertion delay and ordering, but I am not sure, so I need to ask everyone here. For example, if the other 10 records were inserted a while later, that would cause the problem I mentioned.
I'm reading your description again: you first found 10 records, then 20, which is exactly how it should be, because any new updates will add rows to the audit table. Are you observing that the audit records are increasing, decreasing, or changing randomly? If they are fluctuating, then that's not right.
Hi, thanks. Maybe there is some misunderstanding here. I just used this table as an example; it could be any table. Please don't focus on whether it is the user table.
The schema name of the 'User' table in Dynamics is 'systemuser'. Is this a custom table?