So I have a bit of an emerging situation that's been causing us grief for the past few weeks, and I just want to put it out here to see if anyone else has noticed anything similar. I'll flag that I've raised this as a Sev A issue with MS support, but it's stagnating: a combination of them not being able to understand the situation and, I think, in part not believing me that it's a real problem.
We have a piece of functionality that imports data from our clients' planning files (Excel format) via a custom Office add-in. The add-in creates a record on a custom entity, Import Detail, which is basically a placeholder for a JSON blob containing the details of a number of records for another custom entity, Advertising Plan Line Item.
I have an async plugin registered on the Create message of the Import Detail entity that processes the JSON and creates the line items. This code has been in place and working flawlessly for the last 4 years, up until about 2 weeks ago, when my client started reporting more content than expected, with the resulting dollars also greater than expected. Investigating showed that the number of records and the dollar totals differed from sync to sync.
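For anyone unfamiliar with the pattern, the original plugin was shaped roughly like the sketch below. This is a hedged reconstruction, not my actual code: the entity/field names (`adx_importdetail`, `adx_jsonblob`, `adx_lineitem`, etc.), the `LineItemDto` shape, and the use of Newtonsoft.Json are all illustrative placeholders.

```csharp
// Sketch only: schema names and DTO shape are assumptions, not the real solution.
using System;
using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Newtonsoft.Json;   // assumed JSON library; plugins often ILMerge this in

public class ImportDetailCreatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        // Async Create plugin: the Target is the newly created Import Detail record.
        var target = (Entity)context.InputParameters["Target"];
        string json = target.GetAttributeValue<string>("adx_jsonblob");

        var items = JsonConvert.DeserializeObject<List<LineItemDto>>(json);
        foreach (var dto in items)
        {
            var lineItem = new Entity("adx_lineitem");
            lineItem["adx_name"] = dto.Name;
            lineItem["adx_amount"] = new Money(dto.Amount);
            lineItem["adx_importdetailid"] = target.ToEntityReference();
            service.Create(lineItem);   // original pattern: one Create call per record
        }
    }

    private class LineItemDto { public string Name; public decimal Amount; }
}
```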
I did a lot of debugging and ended up changing the plugin code from a single IOrganizationService.Create call per record to batching the lot and performing a single IOrganizationService.Execute(CreateMultipleRequest) instead. With my batch of 30 records, this went from a few of the records being duplicated to the number of duplicates always being a multiple of 30. My initial concern was working out whether Service.Create was duplicating the record creation or the plugin itself was executing more than once. Writing a random number into a field during each plugin execution and persisting it to the DB showed that it was definitely the plugin execution being duplicated; see the image below of a unique record identifier with two sets of data based on the random numbers (stored in the FinanceID field).
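The marker trick plus the batched create looks roughly like this. Again a hedged sketch under assumed schema names (`adx_financeid` standing in for my FinanceID field): each plugin execution stamps every record in its batch with the same random value, so two executions of the same plugin show up as two distinct markers in the database.

```csharp
// Sketch only: entity collection is assumed to be pre-built with same-type records.
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class LineItemBatchCreator
{
    public static void CreateBatch(IOrganizationService service, EntityCollection lineItems)
    {
        // Stamp every record in this execution with one random marker so a
        // duplicated plugin execution is visible as a second marker value.
        string executionMarker = Guid.NewGuid().ToString("N");
        foreach (Entity item in lineItems.Entities)
            item["adx_financeid"] = executionMarker;

        // One CreateMultipleRequest round trip instead of one Create per record.
        var request = new CreateMultipleRequest { Targets = lineItems };
        service.Execute(request);
    }
}
```

Because the whole batch is created in one request, a duplicated execution now produces duplicates in exact multiples of the batch size, which is what made the "multiple of 30" pattern so obvious.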
I've had to move my processing logic out to an action, triggered by an update on create of this record, that tries to kill the second plugin execution: it retrieves the record and only processes it if it isn't already flagged as "processed" via a status field value.
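The guard boils down to a retrieve-then-claim check along these lines. Hedged sketch: the status field name (`adx_processingstatus`) and the option-set values are assumptions for illustration.

```csharp
// Sketch only: field name and option values are placeholders.
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class ImportDetailGuard
{
    const int Pending = 1;      // assumed option-set values
    const int Processing = 2;

    // Returns true if this execution claimed the record and may proceed.
    public static bool TryClaim(IOrganizationService service, Guid importDetailId)
    {
        Entity current = service.Retrieve("adx_importdetail", importDetailId,
            new ColumnSet("adx_processingstatus"));

        int status = current.GetAttributeValue<OptionSetValue>("adx_processingstatus")?.Value ?? Pending;
        if (status != Pending)
            return false;   // another execution already picked this record up

        var update = new Entity("adx_importdetail", importDetailId);
        update["adx_processingstatus"] = new OptionSetValue(Processing);
        service.Update(update);
        return true;
    }
}
```

Worth noting the retrieve-then-update isn't atomic, so there's still a small race window between two near-simultaneous executions; it just narrows the problem rather than eliminating it, which is part of why I don't want this to be the permanent fix.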
I don't want to have to do this kind of additional processing when I have these sorts of jobs to process. Has anyone else seen anything like this happening in their environments? Is there something I should be aware of and check for?
In terms of the data volume and other points:
- a single sync creates about 240 Import Detail records, each containing a single JSON blob
- each JSON blob has the details to create 30 line item records, so roughly 7,140 records in total.
- this is in a Power Apps Online model-driven app, not on-prem.
- *edit* this is in the .CRM6. data centres, impacting both my dev/QA/UAT environments and my client's UAT/PROD environments
- sometimes it can still process successfully without any issues or duplications
- when there is duplicated data, it's not always the same content that gets duplicated, so it's not an issue with the JSON content. We checked that A LOT.
- we also validated that the JSON content added up to the correct number of records and was formed correctly.
As I mentioned earlier, this has been in place for 4 years and has processed other syncs containing 40K records without issue; it only started happening in the last 2 weeks.
<insert help us Obi Wan image here/>
