I needed to import several Excel contract files into a SharePoint list. Each contract area is a defined 'table' structure (though they are not actual tables).
I generated a script to handle converting those tables into a custom contract object. It also converts a free-form worksheet into strings of HTML tables, then attaches all the tables to the one contract object and returns that object as a JSON string.
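To make the freeform-to-HTML step concrete, here is a minimal sketch of that conversion. This is not the author's actual script; the function names are hypothetical, and in a real Office Script the 2D array would come from something like `worksheet.getUsedRange().getValues()` rather than a parameter.

```typescript
// Hypothetical sketch: turn a worksheet's used range (a 2D array of
// cell values) into a single HTML table string.

// Escape characters that would otherwise be parsed as HTML markup.
function escapeHtml(value: string): string {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// Build one <tr> per worksheet row and one <td> per cell.
function toHtmlTable(rows: (string | number | boolean)[][]): string {
  const body = rows
    .map(row =>
      `<tr>${row.map(cell => `<td>${escapeHtml(String(cell))}</td>`).join("")}</tr>`
    )
    .join("");
  return `<table>${body}</table>`;
}
```

The real script presumably does the same walk over the used range, which is why very large freeform worksheets drive the run time up.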
Most files worked without issue, but some had 'freeform' worksheets big enough to push the script past the 2-minute run-time limit. So I broke the freeform-HTML logic out into its own script and added time checks so it exits before the limit is reached (data-transfer time seems to count as script run time, so I cut it off at 90 seconds to leave enough time to transmit the data back). I then modified my flows to run the first script to grab most of the contract object, and to run the second script as many times as needed to retrieve the full HTML text from the freeform worksheet.
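The 90-second cutoff pattern described above can be sketched as follows. This is an illustrative reconstruction, not the author's script: `buildHtmlChunk` and its parameters are hypothetical names, and the idea is simply that the script processes rows until its time budget runs out, then returns a resume point so the flow can call it again.

```typescript
// Hypothetical sketch of the second ("freeform HTML") script:
// build HTML rows until the time budget is spent, then return the
// partial result plus the row index to resume from on the next call.

interface ChunkResult {
  html: string;    // HTML built during this invocation
  nextRow: number; // row to resume from on the next call, or -1 if finished
}

function buildHtmlChunk(
  rows: string[][],
  startRow: number,
  budgetMs: number = 90_000 // stop well before the 2-minute script limit
): ChunkResult {
  const deadline = Date.now() + budgetMs;
  let html = "";
  let row = startRow;
  while (row < rows.length) {
    if (Date.now() >= deadline) {
      // Out of time: hand back what we have and where to resume.
      return { html, nextRow: row };
    }
    html += `<tr>${rows[row].map(c => `<td>${c}</td>`).join("")}</tr>`;
    row++;
  }
  return { html, nextRow: -1 }; // all rows processed
}
```

The calling flow would loop, passing `nextRow` back in on each run and concatenating the `html` pieces, until it gets `-1`.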
After some trial and error (and viewing and modifying the scripts and flows to understand and resolve those errors), I now have 3 lovely flows in a solution that work magically together to do all my importing in the background and write the JSON contract string to a custom column in my Document Library (a multi-line text column, so I can pull the JSON string into a Canvas App when I select the file in a gallery).
One thing I found after I got everything working is that I couldn't write the JSON string to the file's custom column right after the script finished. I searched this forum (among others) and found that the Excel Online connectors actually keep the file locked for more than 5 minutes (supposedly so they can 'make sure' all changes are committed to the file). After extensive testing, I settled on a flow that waits 5 minutes and 30 seconds before trying to write the JSON string to the custom column. I then ran it on nearly 300 contracts, and only about 1% needed to wait longer for the lock to time out.
It takes less than 2 minutes for my scripts to run on most of these files, and none of them write to the files at all (they just read data and return it to the caller). If I run the script with the file open and then close the file, I can modify the file's record immediately, but not when Excel Online runs the script.
Look... I know Microsoft is unlikely to see this as an 'issue' and even less likely to do anything about it if they do concede that it is an 'issue', so I've come up with a work-around idea and I'm coming here to see if the experts here think it's worth it, or if I should just leave what's working alone.
Here's my thought: I could add a step that copies the file to a temporary location and runs the scripts on the copy. That should let me update my original file's custom column directly after the script finishes. But then I may (I haven't tried this yet) have to wait 5+ minutes before I can delete the copy, and after that it will sit in the recycle bin for however long files stay there.
Do any of you have any thoughts on this?
@mdevaney wrote about this, check it out:
https://www.matthewdevaney.com/4-solutions-for-excel-file-is-locked-error-in-power-automate/#Solution--2---Delete-The-Excel-File-&-Bypass-The-Shared-Use-Lock
You can bypass the file lock by using the 'Send an HTTP request to SharePoint' action to delete the locked file.
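For reference, the action settings would look roughly like the fragment below. This is an illustrative sketch only (the site address and file path are made up, and exact settings are covered in the linked post), but the idea is that a REST DELETE issued through the SharePoint connector ignores the shared-use lock that blocks the Excel Online connectors.

```
Site Address: https://contoso.sharepoint.com/sites/Contracts        (illustrative)
Method:       DELETE
Uri:          _api/web/GetFileByServerRelativePath(decodedurl='/sites/Contracts/Temp/copy.xlsx')
Headers:      If-Match: *
```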