Hello,
I have a duplication problem.
I have a flow that matches one SharePoint list (exit) against one Excel file (Assets) to populate another SP list (automation).
To avoid duplication (if the exit list stays the same for two weeks, for example), I added a condition based on the user.
The problem is that, with the condition, I get multiple duplicates (16 per user most of the time) if I submit the same files twice.
Without it, the flow works, but it adds duplicate users (as expected).
Another problem: if my list is empty, nothing happens when I use the condition (no data is added).
My flow is below (sorry for the large screenshot).
No one to help?
In your case, I am not sure, but these guys might be able to help:
OK, but I'm a beginner. Could you help me?
I have an exit list (users who are leaving the company),
a list of all our devices (more than 1,000),
and I have to match both to know who has what (some users can have multiple devices) and put the result in another list...
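In plain terms, the matching described above is a join: for each user on the exit list, find every device assigned to them and write one row per (user, device) pair. A minimal sketch (the field names `owner`, `device`, and `user` are assumptions, not the real column names):

```python
# Hypothetical data standing in for the exit SP list and the Assets Excel file.
exit_list = ["alice", "bob"]
assets = [
    {"owner": "alice", "device": "laptop-01"},
    {"owner": "alice", "device": "phone-07"},
    {"owner": "carol", "device": "laptop-02"},  # not leaving, so not matched
]

# One row per device that a leaving user owns; a user with several
# devices legitimately appears several times.
automation = [
    {"user": user, "device": a["device"]}
    for user in exit_list
    for a in assets
    if a["owner"] == user
]

print(automation)
```

Note the `if` filter: without it, the nested iteration would emit every (user, asset) combination, which is exactly the duplication pattern discussed in this thread.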
Yes, Power Automate will automatically create an 'Apply to each' whenever it finds an array. You can reduce them by using variables, Compose actions, and index values when the values are static.
Today I made a flow that, done the normal way, would have had 15-16 'Apply to each' loops. Instead, I created an array variable, appended items to it, and then used index values with the 'Create item' action, without a single 'Apply to each'.
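The pattern described above can be sketched roughly like this (a loose analogy in Python, assuming the positions in the array are known and static, as the poster says):

```python
# Stand-in for an array variable built with 'Append to array variable'.
rows = []
rows.append({"user": "alice", "device": "laptop-01"})
rows.append({"user": "bob", "device": "phone-07"})

# Stand-in for 'Create item' actions that read fixed index expressions,
# e.g. variables('rows')[0], instead of wrapping everything in a loop.
first_device = rows[0]["device"]
second_device = rows[1]["device"]

print(first_device, second_device)
```

The trade-off is that fixed indexes only work when the shape of the data is known in advance; with a variable number of matches, some loop is unavoidable.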
I would, but it's Power Automate that created those 'Apply to each' loops.
Any idea?
I'm a beginner.
Because of so many 'Apply to each' loops, there are so many duplicates. Try to find a way to reduce them.
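A small illustration (not the actual flow) of why nested 'Apply to each' loops multiply rows: if a 'Create item' action sits inside every iteration of two nested loops and nothing filters the inner array, you get `len(outer) * len(inner)` items instead of only the matches.

```python
exit_list = ["alice", "bob"]
runs = ["run-1", "run-2"]  # e.g. the same file submitted twice

created = []
for user in exit_list:       # outer 'Apply to each'
    for run in runs:         # inner 'Apply to each' Power Automate added
        created.append(user) # 'Create item' fires on every combination

print(len(created))  # 4 items for 2 users: every user is duplicated
```

With more nested loops the factor grows quickly, which would be consistent with seeing something like 16 copies per user.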
Could someone explain to me why it creates so many duplicates?
Hello,
I agree with your solution, but I don't want to use it.
At the moment, because of the duplication, the flow takes 5 to 10 minutes to run (without deleting duplicates), and the same user can legitimately appear with different devices; that's normal.
Hey @666lestat
I know this might affect the performance of the flow, but why not remove duplicates from the 2nd SP list?
When the above execution finishes, maybe you can extend the flow to remove duplicates as well.
Or make another flow that removes duplicates on a regular basis.
I hope you know how to remove duplicates; otherwise, I will share some links.
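The dedup pass suggested above could look like this in outline, assuming a (user, device) pair identifies a row; in a real flow this would be a scheduled run using 'Get items' plus 'Delete item' on the extra rows.

```python
# Hypothetical rows from the 2nd SP list; field names are assumptions.
items = [
    {"user": "alice", "device": "laptop-01"},
    {"user": "alice", "device": "laptop-01"},  # duplicate to drop
    {"user": "alice", "device": "phone-07"},   # different device: keep
]

seen = set()
unique = []
for row in items:
    key = (row["user"], row["device"])
    if key not in seen:
        seen.add(key)
        unique.append(row)  # first occurrence survives

print(unique)
```

Keying on the (user, device) pair rather than the user alone is what preserves the "same user, different devices" rows the original poster wants to keep.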