Power Automate - Building Flows
Answered

Help please - Comparing Datasets and Arrays and struggling

Dear all, 
 
I am trying to build a new and improved workflow for a table that I have. Right now it runs once a day: it deletes everything from the table, then an HTTP GET request captures the new data, which goes through a 'Parse JSON' action and into 'Add a new row'. Easy, and that works a treat.
 
However, as part of ongoing improvements I need to enhance it, because it is proving impractical: the same table is used for lookup values in another table, so whenever it gets deleted, the other table's values disappear, which is completely normal of course.
 
So I currently have a Dataverse table, here is a sample; let's call it DatasetA. I use the 'List rows' action to pull out the data:
 
Worth noting: I have normalized the data objects to keep things straightforward.
 
The table currently has 50 rows:
 
id (unique identifier): ff9efa709974441b9e1282ab408dca3a
name: Dark Matter
description: Dark Matter - Cluster
rowid: 4da7b19c-7fbc-ef11-a72f-002248c764d2
 
Sample JSON Data
 
{
  "id": "085d4d1df3a744e3bfa0580f9c37c30b",
  "name": "Dark Matter",
  "description": "Dark Matter - Cluster",
  "rowid": "4da7b19c-7fbc-ef11-a72f-002248c764d2"
}

And an HTTP output, here is a sample; let's call it DatasetB. I use the 'Parse JSON' action to pull out the data:
 
id (unique identifier): ff9efa709974441b9e1282ab408dca3a
name: Dark Matter 2
description: Dark Matter - Cluster 2
 
Sample JSON Data
 
{
  "id": "085d4d1df3a744e3bfa0580f9c37c30b",
  "name": "Dark Matter",
  "description": "Dark Matter - Cluster"
}
 
Now you will notice that DatasetA has rowid and DatasetB does not. In reality this rowid is needed when I complete actions like 'Delete' or 'Update'; it is unique to the table, but my key is 'id'.
 
Either way, the end goal is simple enough, but I am struggling to get it to work as expected. I essentially want to apply only incremental adds, deletes and updates.
 
ADD: where DatasetB's 'id' is not in DatasetA, add a new row with all the values.
DELETE: where DatasetA's 'id' is not in DatasetB, delete the row using its 'rowid'.
UPDATE: where DatasetA's 'name' or 'description' does not equal DatasetB's 'name' or 'description', update with DatasetB's values.
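The three rules above boil down to a keyed diff on 'id'. Purely as an illustrative sketch of the logic (not a Power Automate implementation; the function and variable names are assumptions based on the samples above):

```python
def diff_datasets(dataset_a, dataset_b):
    """Return (to_add, to_delete, to_update) for two lists of row objects.

    dataset_a rows carry 'rowid' (Dataverse row ID); dataset_b rows do not.
    Both are keyed on the custom 'id' column.
    """
    a_by_id = {row["id"]: row for row in dataset_a}
    b_by_id = {row["id"]: row for row in dataset_b}

    # ADD: in DatasetB but not in DatasetA
    to_add = [row for rid, row in b_by_id.items() if rid not in a_by_id]

    # DELETE: in DatasetA but not in DatasetB; keep 'rowid' for the Delete action
    to_delete = [row for rid, row in a_by_id.items() if rid not in b_by_id]

    # UPDATE: in both, but 'name' or 'description' differs; carry 'rowid'
    # over from DatasetA so the Update action can target the row
    to_update = []
    for rid, b_row in b_by_id.items():
        a_row = a_by_id.get(rid)
        if a_row and (a_row["name"] != b_row["name"]
                      or a_row["description"] != b_row["description"]):
            to_update.append(dict(b_row, rowid=a_row["rowid"]))
    return to_add, to_delete, to_update
```

In Power Automate terms, each of the three lists would typically come from a 'Filter array' over one dataset checking membership in the other, rather than nested 'For each' loops.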
 
So I have been trying to do this for the last week, and I struggle with arrays, period. Everything I try ends up adding 'For each' this or 'For each' that, and then I find I have a bunch of nested loops that don't do anything. The one I struggle with the most is the UPDATE.
 
I have watched several YouTube videos and whatnot, but the UPDATE is where I struggle the most, and for the life of me I cannot get it to work.
 
Can anyone suggest the easiest and best approach to take when comparing datasets and outputs like this, and which steps are the most efficient? To date, I have the following, but as I will be using this with tables of anything from 50 to 100 to 1,000 rows, I want something that quickly identifies what does not match (not contains, or not equal) and then completes the necessary action, rather than going through a 'For each' over every row in the table and checking everything one by one.
 
Hope that makes sense :) 
 
Currently I export my arrays and objects using the 'Select' action, but I don't like it, because unless I have 'rowid' in both datasets it does not return the correct values.
 
 
I don't like that the datasets and the number of objects have to be like for like, otherwise it will not compare the data. Is there anything else I can use just to filter my 'List rows' on 'id' and, where it matches an 'id' in the HTTP output, do the action, rather than storing everything in arrays and comparing it that way?
 
 
Currently I do not have a working UPDATE flow, hence it is disabled at the end of my flow. I would appreciate it if someone could help me re-design this, as it is not working as efficiently as it could, and I am not even sure my current design makes sense.
 
 
  • Verified answer from MrGott:
    Let's break it down into two different problems:
    1. Upsert your Dataverse table with the new values from your HTTP-JSON-dataset thingy.
    2. Delete all values from the Dataverse table that are no longer in the HTTP-JSON thingy.
     
    First problem:
    When looping over all data from your HTTP JSON thingy:
    use the Dataverse "List rows" action and filter it by your unique identifier.
    Use the length function on that output in a condition:
    if the number of rows = 0, then insert your JSON;
    if it is greater than 0, update those rows.
    For the "Update a row" action, you can use the rowid (Dataverse ID) from the "List rows" output.
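As a minimal sketch of that per-row upsert logic (purely illustrative: the `by_id` lookup stands in for the filtered "List rows" call, and the two callbacks stand in for the "Add a new row" and "Update a row" actions):

```python
def upsert(http_rows, dataverse_rows, insert_row, update_row):
    # Index the Dataverse rows by the custom 'id' column, mimicking
    # a "List rows" call filtered on that identifier.
    by_id = {}
    for row in dataverse_rows:
        by_id.setdefault(row["id"], []).append(row)

    for row in http_rows:
        matches = by_id.get(row["id"], [])
        if len(matches) == 0:       # length(...) = 0 -> insert the JSON row
            insert_row(row)
        else:                       # match found -> update using its rowid
            update_row(matches[0]["rowid"], row)
```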
     
    Second problem:
    Deleting the old values from the Dataverse table could be achieved in many ways:
    a) For example, using "List rows" with a filter on the modified field being earlier than the workflow start datetime.
    b) Looping through all Dataverse rows: for each row, filter the HTTP JSON thingy by the id (not the Dataverse ID) and check the length of the output -> if it is 0, you can delete that row.
    c) Before doing anything, set a boolean field "toDelete" to true on every row; then do your processing, and when updating or inserting, also set the toDelete field to false. After the processing you can just delete every row that still has the toDelete field set to true.
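Option (c) can be sketched as follows (again only illustrative, with an in-memory list standing in for the Dataverse table and assumed field names):

```python
def sync_with_todelete(dataverse_rows, http_rows):
    # 1. Mark every existing row for deletion.
    for row in dataverse_rows:
        row["toDelete"] = True

    by_id = {row["id"]: row for row in dataverse_rows}

    # 2. Upsert from the HTTP dataset, clearing the flag as we go.
    for incoming in http_rows:
        existing = by_id.get(incoming["id"])
        if existing is None:
            new_row = dict(incoming, toDelete=False)
            dataverse_rows.append(new_row)
            by_id[new_row["id"]] = new_row
        else:
            existing.update(incoming)
            existing["toDelete"] = False

    # 3. Delete every row that is still flagged.
    dataverse_rows[:] = [r for r in dataverse_rows if not r["toDelete"]]
    return dataverse_rows
```

The nice property of this pattern is that the delete step never has to compare the two datasets at all; the flag already records the result of the upsert pass.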
     
    hth
  • adrienricoux:
    Hi @AM-13091535-0, thank you for the response.
     
    So I definitely learnt something new, and I have managed to resolve both problems now thanks to your solution and the methodology of your toDelete column :)
     
    So now I have the following.
     
    1. Scheduled run
    2. Authenticate to the environment
    3. Export the JSON thingy
    4. If the table is NULL then
      1. Add new rows with 'todelete' set to True, else
      2. Update rows with 'todelete' set to True
    5. For the Add and Update, I followed your instructions and that works perfectly:
      1. Add a new row if the condition equals 0, with 'todelete' set to False
      2. Update the row if the condition equals 1, with 'todelete' set to False
    6. For the Delete:
      1. Filter on 'todelete' eq true
      2. Then delete those Row IDs
     
    For the delete, I kept it simple. There was no need for me to integrate the JSON thingy, since in principle the Add and Update would have covered all those steps, which they do gracefully, so I just deleted whatever was left in my table that equated to true.
     
    Thank you for finally bringing an end to my pain of the last two weeks. I will certainly celebrate today's achievement, thanks to you.
     
     
     

