Power Apps
Answered

Bulk Updating Data in Dataverse Table

Posted by dazzerd

Hi all, I've come up against a problem.

We recently changed the structure of one of our data tables, which worked perfectly fine. We have essentially gone from a table with a small number of rows but lots of columns to one with lots of rows but a small number of columns. The new table has approximately 500k rows.

 

I used to update the data in the smaller table by linking Access to it and writing an update query. Doing this on the larger table "works", but I get intermittent memory issues and the update does not always complete.

 

I know you can update using the "import template" in the advanced section of the environment, using a template like this one: (screenshot: Capture.JPG)

 

This works in principle, but for this to work I need to be able to get the row checksum for each row in the table. The template only provides the top 20 rows or so. As I could be updating any of the 500,000 rows, this won't work. I was hoping the checksum was an available field in the Dataverse table, but apparently not, as I can't see it in the table or extract it using Power Query.

Any ideas? Or am I missing an easier way to update?
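One route worth noting alongside the import template is the Dataverse Web API's $batch endpoint, which can PATCH many rows per request (Dataverse caps a single batch at 1,000 operations, so 500k rows would need chunking). Below is a minimal sketch that only builds the multipart payload; the org URL, table name (`new_mytables`), and column (`new_value`) are placeholder assumptions, not names from this thread:

```python
# Sketch: build a Dataverse Web API $batch body that PATCHes several rows
# in one request. Org URL, table, and column names are placeholders.
import json

API = "https://yourorg.crm.dynamics.com/api/data/v9.2"

def build_batch(updates, batch_id="batch_1", changeset_id="changeset_1"):
    """updates: list of (row_guid, {column: value}) pairs.
    Returns the multipart/mixed body for POST {API}/$batch."""
    parts = [f"--{batch_id}",
             f"Content-Type: multipart/mixed;boundary={changeset_id}",
             ""]
    for i, (guid, cols) in enumerate(updates, start=1):
        parts += [f"--{changeset_id}",
                  "Content-Type: application/http",
                  "Content-Transfer-Encoding: binary",
                  f"Content-ID: {i}",
                  "",
                  f"PATCH {API}/new_mytables({guid}) HTTP/1.1",
                  "Content-Type: application/json;type=entry",
                  "",
                  json.dumps(cols)]        # one JSON body per row update
    parts += [f"--{changeset_id}--", f"--{batch_id}--", ""]
    return "\r\n".join(parts)

body = build_batch([
    ("11111111-1111-1111-1111-111111111111", {"new_value": 42}),
    ("22222222-2222-2222-2222-222222222222", {"new_value": 7}),
])
# POST this body to f"{API}/$batch" with header
# Content-Type: multipart/mixed;boundary=batch_1 plus an OAuth bearer token.
```

In practice you would chunk the 500k rows into batches of at most 1,000 updates and send each batch in turn.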

 

  • a33ik (Most Valuable Professional)

    Hello,

    I would recommend using something like SSIS/ADF/DataFlows.

  • dazzerd

    Thanks, dataflows work using the actual GUID, which I can access. The problem is that the dataflow updates one row at a time, when I need to bulk update 1,000+ in one go. There must be some way to update from a spreadsheet and then import in bulk.

  • ChrisPiasecki (Most Valuable Professional)

    Hi @dazzerd,

     

    The export to Excel has a limit of 50k (or 100k?) rows; in your case the volume is a bit too high to effectively leverage the out-of-the-box export/import.

     

    I believe Dataflows have been significantly improved and will handle all the optimization of batching/bulk updating under the hood. 

     

    If you want to have full control over how the data is loaded, you can use Azure Data Factory or SSIS like @a33ik suggested.

     

     


  • Verified answer: dazzerd

    Yup, you can do this in Power Query to extract the GUID of the record, and then use Dataflows/Power Automate to update, fairly easily and much quicker than using linked tables. I still need to figure out how to update a null value in a date field, but that's a different problem.
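On the open question of writing a null into a date field: if the update runs through the Dataverse Web API (an assumption; dataflow behavior may differ), sending the column with a JSON null clears the stored value, while omitting the column leaves it untouched. A tiny sketch; the column name `new_duedate` is a placeholder:

```python
# Sketch: PATCH bodies for a Dataverse date column. A JSON null clears the
# field; the column name "new_duedate" is a placeholder.
import json

clear_date = json.dumps({"new_duedate": None})          # clears the field
set_date = json.dumps({"new_duedate": "2024-03-01"})    # sets a date
```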
