Hi @Expiscornovus. Thank you for your informative reply.
How much data/rows per Excel file would we be talking about in this scenario? If this is a lot of data I am not sure if Power Automate cloud flows are the best option to meet this requirement.
Each table will have a couple of hundred rows. However, there will be other projects I work on with millions of rows of data.
You might want to consider using the out-of-the-box import feature (which supports upserting), dataflows, or even the upsert method of the Dataverse Web API.
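For reference, an upsert via the Dataverse Web API is just a PATCH request against an alternate key: if a row with that key exists it is updated, otherwise it is created. Below is a minimal sketch of building such a request in Python; the org URL, token, entity set, and key column are placeholder assumptions you would replace with your own environment's values.

```python
import json
import urllib.request

# Hypothetical values -- replace with your environment URL and an
# access token obtained from Azure AD for your Dataverse instance.
ORG_URL = "https://yourorg.crm.dynamics.com"
TOKEN = "<access-token>"

def build_upsert_request(entity_set, key_column, key_value, record):
    """Build a Dataverse Web API upsert request.

    PATCH against an alternate key: Dataverse updates the row if one
    with that key exists, and creates it otherwise (the 'upsert').
    """
    url = f"{ORG_URL}/api/data/v9.2/{entity_set}({key_column}='{key_value}')"
    return urllib.request.Request(
        url,
        method="PATCH",
        data=json.dumps(record).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
        },
    )

# Example: upsert an account keyed by its account number.
req = build_upsert_request("accounts", "accountnumber", "ABC123",
                           {"name": "Contoso"})
print(req.get_method(), req.full_url)
```

Note this only constructs the request; actually sending it requires a valid bearer token for your environment.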
Thanks for mentioning dataflows. Regarding the dataflow method, can I set it up so that it works with my scenario? That is, can I have it constantly syncing with my designated OneDrive folder where all of my Excel files are stored, so that if a row is edited, deleted, or added, the dataflow runs automatically? I've never used dataflows before, so I'm not too familiar with their capabilities. Is there a limit to how often a dataflow can run in a day? Does it only run on a schedule, or can it be triggered?
You mentioned the upsert method of the Dataverse Web API. I've heard this terminology used when talking about delta lakes, but I'm not sure how it works in Dataverse. How does it differ from dataflows?
Regarding the image you kindly shared: where you have "List rows present in a table" with "Table" underneath it, what should I do if I don't want to set a fixed table name? I'm assuming this prompt asks for the name of the table so the action knows what to look for, but what if the name changes or the naming isn't consistent? Is there a way to allow for this?
Thanks in advance.