Batch Update, Create, and Upsert SharePoint Lists

Posted by takolota1 (Moderator)

SharePoint Batch Update, Create, & Upsert Template

(Also an option for full dataset synchronizations with the "Full Sync" template below) 


Compared to using the basic Apply to each, Get items, & Update item approach on large amounts of data, this method requires a small fraction of the API calls counted toward the daily action limit and drastically reduces flow run-times.
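As a back-of-the-envelope illustration, here is a minimal Python sketch of the API-call math. The page and batch sizes below are illustrative assumptions, not values taken from the template:

```python
import math

def classic_call_count(rows, page_size=100):
    """Apply to each approach: one Get items call per page of source data,
    plus one Update item call per row."""
    return math.ceil(rows / page_size) + rows

def batch_call_count(rows, batch_size=1000, page_size=5000):
    """Batch approach: one paged Get items call per load, plus one $batch
    HTTP call per chunk of rows."""
    return math.ceil(rows / page_size) + math.ceil(rows / batch_size)

# 10,000 rows: ~10,100 calls the classic way vs. ~12 calls batched
print(classic_call_count(10_000), batch_call_count(10_000))
```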


By default it is set to take data from any Excel sheet and update records in SharePoint with corresponding data. It works with any potential Power Automate data-source, including HTTP API GET requests, but Excel provides a simple example.

Part of the flow works like a VLOOKUP function: it checks whether a row of updated data in Excel (or another datasource) matches a SharePoint record on a key column and, if so, gets the SharePoint ID for that record. It then uses the batch update method to update those SharePoint records and the batch create method to create new items for any records without a match.
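In Python terms, that matching-and-splitting step looks roughly like this (a sketch only; column names are illustrative, and the actual flow does this with Select actions and expressions rather than code):

```python
def plan_upsert(source_rows, sp_items, key):
    """Index SharePoint items by key (the VLOOKUP-style lookup), then split
    source rows into a batch-update list (match found, carry the SP ID)
    and a batch-create list (no match)."""
    sp_index = {item[key]: item["ID"] for item in sp_items}
    updates, creates = [], []
    for row in source_rows:
        sp_id = sp_index.get(row[key])
        if sp_id is None:
            creates.append(row)
        else:
            updates.append({**row, "ID": sp_id})
    return updates, creates

sp = [{"ID": 1, "Email": "a@x.com"}, {"ID": 2, "Email": "b@x.com"}]
src = [{"Email": "a@x.com", "Status": "Done"}, {"Email": "c@x.com", "Status": "New"}]
updates, creates = plan_upsert(src, sp, "Email")
```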


 

Helpful related resources

David Wyatt's Flow Optimization Post For Loading SharePoint Records: https://www.linkedin.com/pulse/top-5-ways-optimize-your-flows-david-wyatt/?trackingId=X9bMmnTZ2QBuu48HJ29wQA%3D%3D

Microsoft Batch API Documentation: https://learn.microsoft.com/en-us/sharepoint/dev/sp-add-ins/make-batch-requests-with-the-rest-apis
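The batch API wraps many REST sub-requests into a single $batch POST. As a rough Python sketch of the multipart body that a batch-update call assembles (based on the structure in the Microsoft documentation above; the boundary names, site, list, and type names here are placeholders, and the real flow generates GUID boundaries):

```python
import json

def build_batch_body(site_url, list_name, item_type, updates,
                     batch_guid="batch_example", cs_guid="changeset_example"):
    """Assemble the multipart body of one $batch POST, wrapping one
    update sub-request per item inside a single changeset."""
    parts = [
        f"--{batch_guid}",
        f'Content-Type: multipart/mixed; boundary="{cs_guid}"',
        "",
    ]
    for item in updates:
        item = dict(item)          # don't mutate the caller's dict
        item_id = item.pop("ID")
        payload = {"__metadata": {"type": item_type}, **item}
        parts += [
            f"--{cs_guid}",
            "Content-Type: application/http",
            "Content-Transfer-Encoding: binary",
            "",
            f"PATCH {site_url}/_api/web/lists/getByTitle('{list_name}')/items({item_id}) HTTP/1.1",
            "Content-Type: application/json;odata=verbose",
            "If-Match: *",
            "",
            json.dumps(payload),
            "",
        ]
    parts += [f"--{cs_guid}--", f"--{batch_guid}--", ""]
    return "\r\n".join(parts)
```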

TachyTelic.Net Blog & Videos
SharePoint Batch Create Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/ 
Video: https://youtu.be/2dV7fI4GUYU

SharePoint Batch Delete Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/ 
Video: https://www.youtube.com/watch?v=2ImkuGpEeoo

 

 

(Old) Update V1

Version 1 Explanation Video: https://youtu.be/l0NuYtXdcrQ 

Download The Template Batch Update Flow

Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10gFkycdx6zpRfrI-s_jCDwIK6dpyyDqk/view?usp=sharing

Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1e6-INUykIT22ppVh5m4kxz8us_7qXy7q/view?usp=sharing

Formulas For Random Number Columns
SharePoint Rand1To50 Column Calculated Default Value Formula:
=INT(INT(RIGHT(NOW(),2))/2)
Excel Random Column Formula:
=ROUNDDOWN(((Rand()*100)+1)/2, 0)

(Old) Update V1.5

This version makes it easier to handle cases where the list name may have changed since its creation and moves a few of the primary key column matching inputs to the 'settings' compose action so users don't have to look through & edit the more complicated expressions to set up the flow.

The flow can easily expand to any size of SharePoint list by adding more Get items actions and batch update parallel branches. If speed is a concern for anyone, there are ways to make the Get items actions all load in parallel too (up to 50 branches). It's really only limited by how much effort people want to put into their flows & lists.

Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing 

Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing


Self Update

The SharePoint Self Batch Update assumes you just want to perform simple updates using only the existing data in the list and removes all the actions related to comparing two datasets to find updates. This may be much easier to use if you just want to quickly do something simple like get all the items created in the past month and mark them all with a Complete status.
But you will be limited to using just the data already in the list and any values you can manually input into the flow.

Solution Zip Download Link: https://drive.google.com/file/d/1ZfDGP48fuMSuLQyv_k-tW5ofGsa3dlMC/view?usp=sharing

Self Batch Update Power Automate Legacy Import: https://drive.google.com/file/d/104v4rKSx9fYqfJnvjxwq4jioOE0rjsy-/view?usp=sharing


Upsert V2.7

-Includes a batch create segment to create an upsert capability. If anyone wants to only update records, then they can remove the Batch Create section. If anyone wants to only create records, then they can go to the GenerateSPData action, remove the expression for the ID field and insert the null value expression.

-Further simplifies the set-up, removing the need to add any additional SharePoint Get items actions & removing the need for parallel branches.
-Can now work on lists with a few million items without adding more actions or branches. It also implements a faster load method using the SharePoint HTTP action, as described in point 5 of David Wyatt's flow-optimization article linked above.
-The batch loops have been changed from Do until loops to chunking into Apply to each loops so the batch actions can now run concurrently for additional speed. If you have many batches of data you want to process faster, you can try increasing the concurrency settings on the Apply to each loops containing the SendBatch actions.
-The "setting" inputs action was moved to the top of the flow to help accommodate the new streamlined set-up.

-A SP HTTP call now automatically fixes some issues with referencing the correct list name.

-Faster list load time.

-Adds another HTTP call to get the site users into an object indexed/reference-able by email addresses & gives an example of how to use that to batch update a person column. Anytime the updated source dataset has a blank or an email value not found in the top 5000 site users, it will replace any person in that item with a null value.

-Hyperlink columns may require this format

"ColumnName": {
      "Description": "UrlTextToShow",
      "Url": "Url"
}

So this expression: 

json(concat('{"Description":"', item()?['InsertSourceLinkColumnName'], '","Url":"', item()?['InsertSourceLinkColumnName'], '"}'))


-Multiselect columns may require this format 

"MultiSelectColumnInternalName": {
      "__metadata": {
        "type": "Collection(Edm.String)"
      },
      "results": [
        "Value 1",
        "Value 2",
        "...",
        "Value N"
      ]
}

So something like this expression: 

json(concat('{"__metadata": {"type": "Collection(Edm.String)"},"results":[', item()?['InsertCommaSeperatedMultiSelectValues'], ']}'))
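For reference, here is a small Python sketch of what that concat()/json() expression produces. Note it assumes the source column already contains individually double-quoted, comma-separated values (e.g. "Value 1","Value 2"); raw unquoted text would break the JSON:

```python
import json

def multiselect_payload(comma_separated_values: str) -> dict:
    """Mirror the concat()/json() expression: wrap already-quoted,
    comma-separated values in the Collection(Edm.String) envelope."""
    return json.loads(
        '{"__metadata": {"type": "Collection(Edm.String)"},"results":['
        + comma_separated_values + ']}'
    )

payload = multiselect_payload('"Value 1","Value 2"')
```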

 

Solution Zip Download Link: https://drive.google.com/file/d/1ZfDGP48fuMSuLQyv_k-tW5ofGsa3dlMC/view?usp=sharing

V2.7 Upsert Power Automate Legacy Import: https://drive.google.com/file/d/1KTh00laO4so4cSRPeo8ODW9Vw_vOX5zJ/view?usp=sharing
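The chunking change described above (Do until loops replaced by chunked Apply to each loops) can be sketched in Python as follows; the batch size is illustrative:

```python
def chunk(rows, size):
    """Split rows into fixed-size batches; V2.7 feeds batches like these to
    an Apply to each loop so the SendBatch calls can run concurrently."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

# 2,500 rows at a batch size of 1,000 -> three batches of 1000, 1000, 500
batches = chunk(list(range(2500)), 1000)
```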


Upsert Non Unique Keys V1.2

The default upsert template requires something that can form a unique primary key & foreign key to match records between the source & destination datasets. For each source record, it finds the first matching destination record to update. So if your key is not unique and multiple destination records share the same key, the default template only updates one record per unique key value; it does not also update the destination records with duplicate key values.
Updating multiple destination records that share the same key(s) usually meant running an Apply to each over the source records, doing a Get items with a Filter query on the destination list for all records relevant to the current source record, & then updating each relevant destination record in a nested loop.
This set-up offers a possible work-around to batch update multiple destination records that share the same key using XML & Xpath queries. It also allows you to reference both new source values & original destination values in the GenerateSPData action.
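The template implements this matching with xml()/xpath() expressions inside Power Automate; the equivalent one-source-row-to-many-destination-rows logic, sketched in Python with illustrative names:

```python
from collections import defaultdict

def plan_multi_update(source_rows, sp_items, key):
    """Group destination items by a (non-unique) key so each source row
    updates every matching destination record, not just the first."""
    by_key = defaultdict(list)
    for item in sp_items:
        by_key[item[key]].append(item["ID"])
    updates = []
    for row in source_rows:
        for sp_id in by_key.get(row[key], []):
            updates.append({**row, "ID": sp_id})
    return updates
```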

(V1.2 Includes performance enhancements so the xPath update can handle larger amounts of data.)

Solution Zip Download Link: https://drive.google.com/file/d/1ZfDGP48fuMSuLQyv_k-tW5ofGsa3dlMC/view?usp=sharing

V1.2 Upsert Non Unique Keys Power Automate Legacy Import: https://drive.google.com/file/d/1FDD7qEYZTTxJvWJc6SCGaIzTlOhX4NLS/view?usp=sharing


Full Sync V1.1 (Combined Upsert & Batch Deletion Sync)

The SharePoint Batch Full Sync template pulls in all the SP List & Source data to perform a batch upsert. But then it also adds on a Batch Deletion Sync to find & delete all the records in the SP List that are not in the Source dataset.

Now, this is initially set up to pull in all the datasource records for all batch actions in the same Do until Get source data loop. That piece is limited by the 100MB maximum message / variable size in Power Automate, so this Full Sync version initially only works with datasources of 100MB or less in total size. This is mainly to keep the flow simpler for the majority of users, who likely will not have datasources of many hundreds of thousands of records.
If you want to push past this 100MB limitation, you will need to separate the source get-data for the batch upsert section from the source get-data for the batch deletion sync section.

For the batch upsert section, you can use a set-up like the main batch upsert template, where it loads records with all columns 100,000 at a time (or 100, or 5,000, or whatever your source's per-load limit is) and runs the batch upsert on each source load before the Do until loop gets the next load. Because it processes one load at a time, it never holds anywhere near 100MB in memory at once.

The batch deletion sync section can then use a different get-data set-up, similar to the "Do until Get destination list IDs + keys" section of the templates, where each loop pulls a load from the source dataset and uses a Select action to pass only a few columns on to the variable holding everything in memory. Since deletions only require the primary key values, you can set the Select to take only the primary key column from each load and pass that on to the "Source data outputs" variable. A full listing of the primary key values in your source dataset is much smaller than the entire table, so the 100MB limit should then hold a few million records' worth of the key data needed to run the batch deletion sync.
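The deletion-sync step boils down to a set difference on primary keys. A minimal Python sketch (field names are illustrative):

```python
def plan_deletions(source_keys, sp_items, key):
    """Queue for batch deletion any destination item whose key is absent
    from the source dataset; only primary keys are held in memory."""
    source_set = set(source_keys)
    return [item["ID"] for item in sp_items if item[key] not in source_set]

sp = [{"ID": 1, "Key": "A"}, {"ID": 2, "Key": "B"}, {"ID": 3, "Key": "C"}]
to_delete = plan_deletions(["A", "C"], sp, "Key")
```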

Solution Zip Download Link: https://drive.google.com/file/d/1ZfDGP48fuMSuLQyv_k-tW5ofGsa3dlMC/view?usp=sharing

V1.1 Full Sync Power Automate Legacy Import: https://drive.google.com/file/d/1Op_YwcjZE6ZRx21NLJ31Y7bjiTSAbgwZ/view?usp=sharing



 

Thanks for any feedback,

Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).

And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.

Also if you find my free builds/templates helpful, note they are often a by-product of work I do with a USAID contractor.



Comments

  • Lawrence_Sprou1:
    The error detection in Upsert Version 2.7 misses statusCode 500 errors that occur when submitting long text (over 255 character) to a SharePoint text field.
     
  • FD-24041648-0:
    What is the format to update a people field in SharePoint?  I have tried everything and cannot get it to update a single entry people field.
  • FD-24041648-0:
I am trying to import the v2.7 zip and it gets "something went wrong" and doesn't import. What am I doing wrong? I have since fixed this. I am confused on how to get around the size limit. I have an Excel file with 70 columns and 36k rows and I cannot use the 100000 with pagination. I can move it to 25000 but get a limit on the batch insert saying that it is too large. I would like to know what to change to get it to look at the records past 25000. Do I use the normal means of changing the skip count to 25000?
  • CC-21031150-0:
    Hello,

    I've had some success setting up the Full Sync V1.1 legacy solution. However, I'm running into a couple of issues which I can't seem to resolve.

1) I am getting some duplicate entries which aren't present in the source Excel file.
     
    2) Only around 1,500 entries appear to be deleted when making significant changes to the list.

I have attempted to fix this by reducing the batch size and by reducing the degree of parallelism in case I was hitting an unknown limit; however, this has not helped.
     
    Do you have any suggestions on how I could remedy this issue?

    Thanks.
  • Lawrence_Sprou1:
While working with the Batch Upsert V2.7, I found that I could get about a 30% reduction in processing time by moving two actions from the Do until Upsert to just after the Combined Array action.
So I moved the "Select IndexGetItemsByKeys" and the "Compose ReformatKeysIndex" to before the Do until loop for the upsert.
This works because they don't need data from inside the loop and don't need to be redone for each iteration of the loop.
For large datasets, this could save a lot of time.
  • MD-21011017-0:
The UPSERT operation is really helpful to us. Our input source is Power BI, so we implemented some external skip-token and Do until logic. It is working like a charm.
     
    Assume:
    varTotal = 8832
    varChunkSize = 200
    varIndex starts at 0
    Iteration 1
    remaining = 8832 - 0 = 8832
    min(200, 8832) = 200
    New varIndex = 0 + 200 = 200
    … later …
    Second to last iteration (when 120 remain)
    Suppose varIndex = 8712
    remaining = 8832 - 8712 = 120
    min(200, 120) = 120
    New varIndex = 8712 + 120 = 8832
    Now your Do Until condition (e.g., @greaterOrEquals(variables('varIndex'), variables('varTotal'))) evaluates to true, and the loop ends neatly with varIndex == varTotal.
    What each part means

    variables('varIndex')
    Your current position (start offset) in the master array.

    variables('varChunkSize')
    How many items you want to process per batch (e.g., 200).

    variables('varTotal')
    The total number of items in the master array (e.g., length(variables('varBatchLines'))).

    sub(variables('varTotal'), variables('varIndex'))
    The remaining number of items to process: remaining = total - index.

    createArray(variables('varChunkSize'), sub(...))
    Builds a two-element array: [chunkSize, remaining].

    min(createArray(...))
    Takes the smaller of {chunkSize, remaining}; i.e., “how many items should we advance the index this iteration?”

    add(variables('varIndex'), min(...))
    Moves the index forward by exactly the amount you processed this round.
    ----------------------------------------------------------------------------------------------------------------
    Inside the Do until, but after the inner Apply to each (the one that processes the current chunk). The order inside Do until should be:
    Compose – Current Chunk
    take(
      skip(variables('varBatchLines'), variables('varIndex')),
      variables('varChunkSize')
    )
    Apply to each – Process Current Chunk

    Input: output of the “Current Chunk” compose.
    Do your Create/Update (upsert) here.
    (Turn on Concurrency in the inner loop if needed.)

    Set variable – Increment Index
    Use your defensive expression:

    add(
      variables('varIndex'),
      min(
        createArray(variables('varChunkSize'), sub(variables('varTotal'), variables('varIndex')))
      )
    )
    Do not put the increment inside the inner Apply to each. It must run once per chunk, not once per row.

       
  • PowerApps01:
    Hi @takolota1 (or anyone else that may be able to help),
     
    I'm trying to use the upsert 2.7 but have Power BI dataset (run a query against a dataset) as my datasource but don't know what I need to change or add to the flow, your help is much appreciated.
  • mpdrsn:
    Great work!
    Before I implement this, does the batch update version the item? Or is there a way to avoid this without a loop?
  • racheltsky:
    Is there any way to use this for image columns?
  • mm_512:
    Hello Tyler,
    I'm using the "Title" column as the key. The flow runs, but I get {"error":{"code":"-1, Microsoft.SharePoint.Client.InvalidClientQueryException","message":{"lang":"en-US","value":"The expression \"web/lists/getByTitle('Gr12Assets')/items(PF2YC71G)\" is not valid."}}}.

    The Select output gives me all the rows in the changeset:

The "SendBatch" action fails and no record gets updated. Any thoughts on where I am going wrong? Is it the choice of my key column?
    The batchData action's output is: