Hi All,
I am new to Dataverse and have been tasked with moving a very large (500,000-row) data capture from Excel to Dataverse.
The data consists of a Primary ID column and various makes/models for each ID.
There are 2,000+ models, so I have created a separate Models table within the solution.
In the new Dataverse table I have been using Edit in Excel, copying over the Primary IDs and then matching those IDs to the unique identifier (GUID) keys for the lookup column.
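For context, the matching step is done with a formula roughly like the one below, pulling the GUID from an exported list of models; the table and column names here are just placeholders for illustration, not my actual schema:

=XLOOKUP([@Model], ModelsExport[Model Name], ModelsExport[Model GUID])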
I am now at 400,000 rows, and 4 out of 5 times I attempt to publish, it states "Retrieving workbook data..." for several minutes and then I receive the error "Could not retrieve new data from workbook". If I keep attempting to publish, the data occasionally goes through. Sometimes, if I close the workbook, open a new session and reload all the data (taking 10+ minutes, including freezes and hitch-ups), I can publish another 1,000 rows before hitting the same error.
In a similar vein, if I publish 1,000 IDs and 1,000 lookup keys and the publish succeeds, most of the lookup keys are deleted afterwards and need to be pasted back in, leaving around 900 lookup keys that have to be republished.
Is there any possible fix for this that would allow consistent publishing of the data without the freezes and errors?