I have a use case where I need to copy SharePoint files (2-3 files uploaded yearly) to Azure Blob Storage.
Each file uploaded to SharePoint contains a year in its name. Eg: 17 XYZ 2024, 15 XYZ 2024, 23 XYZ 2023 (test)
While writing the files I have to use the name XYZ_YEAR, with YEAR fetched from the original file name. Eg: XYZ_2024, XYZ_2023
For the first two examples I can fetch the year easily using this function - last(split(variables('file'),' ')). However, this does not work when the year is not at the end, and the uploaded file names are not (and will not be) consistent.
I just want to fetch the year from the file name irrespective of its placement in the filename.
Thank you for the help in advance.
It's kind of urgent, any help would be appreciated 🙂
Can we rename the Excel sheet while copying the Excel files to another location?
Thanks for the help in advance
Thank you so much 🙂
I was able to implement it for my usecase. Much appreciated 🙂
Select
From:
split(
    xpath(
        xml('<a/>'),
        concat('translate("', outputs('Compose'), '", "-+_(){}[]", "         ")')
    ),
    ' '
)
The xpath() function is just a fancy shortcut to replace all occurrences of the characters -+_(){}[] with a space. Note that XPath translate() drops any character that has no counterpart in the replacement string, so the replacement string needs one space per separator character.
If you find this scary, replace it with nested replace() functions.
Add other possible separators if necessary.
The result is then split at the spaces.
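The same translate-then-split step can be mirrored in Python with str.maketrans (a sketch of the logic, not the Power Automate runtime):

```python
# Map each separator character to a space, mirroring the XPath translate() call.
SEPARATORS = "-+_(){}[]"
TABLE = str.maketrans({ch: " " for ch in SEPARATORS})

def tokenize(filename: str) -> list[str]:
    """Replace separators with spaces, then split on single spaces."""
    return filename.translate(TABLE).split(" ")

print(tokenize("23_XYZ-2023 (test)"))  # ['23', 'XYZ', '2023', '', 'test', '']
```

Splitting on a single space keeps empty strings in the result, just like the split() in the flow does; the later steps rely on filtering those out.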
Map:
if(
    and(
        equals(
            length(item()),
            4
        ),
        isInt(item()),
        startsWith(
            item(),
            '20'
        )
    ),
    item(),
    ''
)
Checks whether each item is 4 characters long, is an integer, and starts with '20';
if so, it returns its value, otherwise an empty string.
Add more tests if needed.
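The per-item check translates almost literally into Python (a sketch; isInt() is approximated with str.isdigit):

```python
def year_or_empty(token: str) -> str:
    """Keep a token only if it looks like a 4-digit year starting with '20'."""
    if len(token) == 4 and token.isdigit() and token.startswith("20"):
        return token
    return ""

tokens = ["23", "XYZ", "2023", "", "test"]
print([year_or_empty(t) for t in tokens])  # ['', '', '2023', '', '']
```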
Compose 2
if(
    equals(
        length(
            join(
                union(body('Select'), json('[]')),
                ''
            )
        ),
        4
    ),
    join(
        union(body('Select'), json('[]')),
        ''
    ),
    'noYearInFilename'
)
Checks whether there is exactly one clear result (and returns it); otherwise it returns 'noYearInFilename'.
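The union()/join() trick deduplicates the mapped list, leaving at most one distinct year plus the empty string, then accepts the result only if exactly four characters survive. In Python terms, a sketch:

```python
def pick_year(mapped: list[str]) -> str:
    """Deduplicate (like union() with an empty array), join, and accept
    the result only if a single clean 4-character year remains."""
    joined = "".join(dict.fromkeys(mapped))  # order-preserving dedupe
    return joined if len(joined) == 4 else "noYearInFilename"

print(pick_year(["", "", "2023", "", ""]))  # 2023
print(pick_year(["", "", "", "", ""]))      # noYearInFilename
print(pick_year(["2023", "", "2024"]))      # noYearInFilename (ambiguous)
```

Two different years in one name join to eight characters, so the length check also guards against ambiguous file names.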
Any leads, guys? I haven't been able to figure out the solution yet.
This will remove manual intervention and could help with triggering the pipeline as well. Right now the uploader has to raise a ticket and wait for the Data Factory pipeline to be run manually. With this process we will have an event trigger on the Data Factory side which runs the due process automatically and gives an instant result, saving 0.5-1 day of wait time for users and removing the dependency on the support team to take the file from SharePoint and upload it to Blob Storage.
You are trying to automate things that don't need to be automated. The cadence is way too infrequent.
This file is created by multiple people, the point of contact keeps changing, and they don't adhere to the convention for us. We want to automate the solution; earlier we used to manually take the file and adapt it to our needs.
It would be much simpler to enforce file naming conventions and to rename files that fail to meet the convention rules.