Power Apps - Microsoft Dataverse
Answered

Dataverse Load Performance Benchmarks

Posted on 2 Nov 2022 17:23:20 by Bill Ryan

Hello: I'm looking to load a non-trivial amount of data into Dataverse (~10-15 million records for 2 entities, and several others between 500k and 1 million) using Synapse Pipelines. We're planning for the production move and need a rough estimate of how long the load will take (and ideally to upgrade as necessary to keep that time as small as possible). I have read through the API documentation, but I was wondering if there's a way to determine what the load times will be. For instance, if 1 million records takes 8 hours, I don't believe it scales linearly such that 12 million would take 12 × 8 hours. I also believe Sandbox instances are a good bit slower in general than production ones, but I'm not sure how to quantify that.

1-Is there something we can add in terms of licensing that will speed up load performance, assuming we're sure the bottlenecks are on the Dataverse side?

2-Is it correct that there's a fairly significant difference between Sandbox and Prod instances? I'm pretty sure there is, but if so, is there any way to approximately quantify the difference?

3-Has anyone done loads of this size, and do you have any general thoughts or suggestions?

Replies:
  • Bill Ryan on 04 Nov 2022 at 14:12:34
    Re: Dataverse Load Performance Benchmarks

    Thank you.

  • Verified answer
    ChrisPiasecki (Most Valuable Professional) on 03 Nov 2022 at 23:11:47
    Re: Dataverse Load Performance Benchmarks

    Hi @Bill_Ryan33149,


    I don't recall whether there's published information on exactly how resources are allocated to servers, but I believe each environment is placed on an appropriate VM tier based on a number of factors, such as the number of licensed users in the environment, transaction volume, database size, and other metrics. We have no control over this.


    Things that will improve performance include:

    • Batch requests
    • Use multithreading (you'll need to tweak the number of parallel threads and the batch size to find your optimum)
    • Disable plugins and workflows that could trigger on record creation
    • Disable plugin trace logging
    • Disable auditing
    • Disable duplicate detection
    • Minimize network latency where possible by keeping the source data as close to the target environment as possible (ideally the same geographic region). Copying the data into Azure first should help even further.
    • Keep in mind that the number of columns, and lookups in particular, will impact throughput.
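    The first two bullets can be sketched together. Below is a minimal Python sketch, assuming the Dataverse Web API `$batch` endpoint (which accepts a multipart/mixed body of operations); `ORG_URL`, `TABLE`, and the `send` callback are placeholders you'd replace with your org URL, entity set name, and an authenticated HTTP POST. Batch size and thread count are the two knobs to tune for throughput.

    ```python
    # Sketch: load records into Dataverse in parallel batches via the Web API
    # $batch endpoint. ORG_URL and TABLE are placeholders; `send` is a
    # pluggable callback so the batching logic can be tested without a network.
    import json
    import uuid
    from concurrent.futures import ThreadPoolExecutor

    ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder org URL
    TABLE = "accounts"                            # placeholder entity set name

    def chunk(records, size):
        """Split the record list into batches of at most `size` items."""
        return [records[i:i + size] for i in range(0, len(records), size)]

    def build_batch_body(records):
        """Build a multipart $batch body with one changeset of POST operations."""
        batch_id, cs_id = uuid.uuid4().hex, uuid.uuid4().hex
        lines = [f"--batch_{batch_id}",
                 f"Content-Type: multipart/mixed;boundary=changeset_{cs_id}", ""]
        for i, rec in enumerate(records, start=1):
            lines += [f"--changeset_{cs_id}",
                      "Content-Type: application/http",
                      "Content-Transfer-Encoding: binary",
                      f"Content-ID: {i}", "",
                      f"POST {ORG_URL}/api/data/v9.2/{TABLE} HTTP/1.1",
                      "Content-Type: application/json;type=entry", "",
                      json.dumps(rec)]
        lines += [f"--changeset_{cs_id}--", f"--batch_{batch_id}--", ""]
        return f"batch_{batch_id}", "\r\n".join(lines)

    def load(records, send, batch_size=100, threads=4):
        """Send batches in parallel; `send(boundary, body)` does the HTTP POST."""
        batches = chunk(records, batch_size)
        with ThreadPoolExecutor(max_workers=threads) as pool:
            list(pool.map(lambda b: send(*build_batch_body(b)), batches))
        return len(batches)
    ```

    In a real run, `send` would POST each body to `{ORG_URL}/api/data/v9.2/$batch` with `Content-Type: multipart/mixed;boundary=batch_...` and a bearer token, and would also need retry handling for service-protection (429) throttling responses, which is where tuning threads and batch size matters most.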


    Hope this helps. 


    ---
    Please click Accept as Solution if my post answered your question. This will help others find solutions to similar questions. If you like my post and/or find it helpful, please consider giving it a Thumbs Up.

