I am working with custom tables on a project that requires refreshing the data daily, but every time I run the job to refresh the data, it duplicates all the existing records. Is there a way to automatically delete duplicates when the import happens?
Hi @Anonymous ,
You should set up duplicate detection rules to catch them. If you are using the Data Import Wizard, you can configure it to prevent duplicates from being imported.
Additionally, you can set up alternate keys on your table to enforce uniqueness across one or more columns. If you then try to create a record that duplicates an existing key value, the system will reject it at the database level.
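As a rough illustration of what an alternate key definition looks like, here is a minimal sketch of the JSON body you would POST to the Dataverse Web API's `EntityDefinitions(...)/Keys` endpoint. The table and column names (`new_externalidkey`, `new_externalid`) are hypothetical; adjust to your own schema, and check the metadata Web API docs for the exact endpoint:

```python
import json

def alternate_key_payload(schema_name, display_name, key_attributes):
    """Build an EntityKeyMetadata JSON body for creating an alternate key.
    key_attributes is a list of one or more column logical names that must
    be unique in combination."""
    return {
        "SchemaName": schema_name,
        "DisplayName": {
            "@odata.type": "Microsoft.Dynamics.CRM.Label",
            "LocalizedLabels": [
                {
                    "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
                    "Label": display_name,
                    "LanguageCode": 1033,
                }
            ],
        },
        "KeyAttributes": key_attributes,
    }

# Hypothetical custom table with an external-ID column:
payload = alternate_key_payload(
    "new_externalidkey", "External ID Key", ["new_externalid"]
)
print(json.dumps(payload, indent=2))
```

Once the key is active, an import that tries to insert a second record with the same `new_externalid` value fails instead of creating a duplicate.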
I set up duplicate detection, but it only notifies the user about the duplicates, and then I have to decide manually what to keep and what to discard. Is there a way to have all detected duplicates deleted automatically?
Hi @Anonymous,
You can set up a duplicate detection job to find all records that have been identified as duplicates. You can then set up a bulk deletion job to delete those records based on the criteria you specify.
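For reference, a bulk deletion job can also be submitted through the Web API's `BulkDelete` action. The sketch below only builds the request body as a plain dict; the parameter names mirror the SDK's `BulkDeleteRequest`, but the table name `new_record` and the condition shape are illustrative assumptions, so verify the exact QueryExpression JSON serialization against the Web API reference before using it:

```python
import json

def bulk_delete_body(entity, job_name, conditions, start):
    """Sketch of a Dataverse `BulkDelete` action body. `conditions` is a
    list of ConditionExpression-style dicts; the job runs once if
    RecurrencePattern is empty."""
    return {
        "QuerySet": [
            {
                "@odata.type": "Microsoft.Dynamics.CRM.QueryExpression",
                "EntityName": entity,
                "Criteria": {
                    "FilterOperator": "And",
                    "Conditions": conditions,
                },
            }
        ],
        "JobName": job_name,
        "SendEmailNotification": False,
        "ToRecipients": [],
        "CCRecipients": [],
        "RecurrencePattern": "",  # empty string = run once
        "StartDateTime": start,
    }

body = bulk_delete_body(
    "new_record",                     # hypothetical custom table
    "Delete duplicate records",
    [{"AttributeName": "new_name", "Operator": "Equal", "Values": ["Contoso"]}],
    "2024-01-01T00:00:00Z",
)
print(json.dumps(body, indent=2))
```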
There is also a Microsoft-provided code sample showing how to run bulk detection and deletion in one go programmatically, if desired.
Hi @ChrisPiasecki, would you mind elaborating on how to set up the bulk deletion job? I understand how to create a job that deletes ALL records matching a filter (e.g., field X contains Y), but how do I remove only the duplicates and keep ONE record?
Thanks!
@Anonymous @NyomanLukas @ChrisPiasecki
You could also try this custom template for finding and removing duplicates in any Power Automate data source: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Find-and-Remove-Duplicates/m-p/2191403#M1611
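If you prefer to script it, the keep-one-delete-the-rest logic the thread is after can be sketched in a few lines: query the table, group rows by the columns that define a duplicate, keep the oldest row in each group, and collect the IDs of the rest for deletion. Field names below (`new_email`, `createdon`, `id`) are hypothetical placeholders for whatever your query returns:

```python
from itertools import groupby
from operator import itemgetter

def ids_to_delete(records, key_fields, keep_by="createdon"):
    """Given query results (a list of dicts), group rows by the
    duplicate-defining columns and return the IDs of every row except the
    earliest one in each group, so exactly one record survives per group."""
    keyfn = itemgetter(*key_fields)
    ordered = sorted(records, key=lambda r: (keyfn(r), r[keep_by]))
    doomed = []
    for _, group in groupby(ordered, key=keyfn):
        rows = list(group)
        doomed.extend(r["id"] for r in rows[1:])  # rows[0] (oldest) is kept
    return doomed

records = [
    {"id": "a", "new_email": "x@contoso.com", "createdon": "2024-01-01"},
    {"id": "b", "new_email": "x@contoso.com", "createdon": "2024-02-01"},
    {"id": "c", "new_email": "y@contoso.com", "createdon": "2024-01-15"},
]
print(ids_to_delete(records, ["new_email"]))  # -> ['b']
```

The returned ID list can then feed a delete loop or a bulk deletion request, leaving one record per duplicate group intact.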
Hi, it won't let me set up duplicate detection on custom tables; custom tables simply do not appear in the list when setting up a new duplicate detection job. Any other ideas?