Hi,
I have an API which returns text, supposed to be a CSV file, in the form below:
datetime;mois;semaine;joursem;heure;vacance;Text;freQ;Scored Labels
7/15/2017 6:00:00 PM;7;28;7;18;1;29,234;67;148,2313385
10/14/2016 4:00:00 AM;10;42;6;4;0;18,922;0;-9,692166328
02/04/2017 12:00;2;5;7;12;0;9,239;0;39,99219513
05/11/2017 05:00;5;19;5;5;0;17,421;0;1,262338638
10/01/2016 13:00;10;40;7;13;0;22,333;2;-0,870968521
11/20/2016 6:00:00 AM;11;48;1;6;0;11,83;0;-13,13813114
10/18/2016 4:00:00 PM;10;43;3;16;0;20,529;42;46,49481583
2/23/2018 9:00:00 AM;2;8;6;9;0;1,231;0;1,8540411
01/05/2017 05:00;1;1;5;5;1;6,426;0;0,300328046
I don't need to save a CSV file; I need to transform this into a table so I can save 2 specific columns to a SharePoint list.
I tried the Create CSV table action, but I can't make it work.
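To illustrate the transformation I'm after, here is the equivalent logic sketched in Python (outside Power Automate, just to show the parsing; the choice of the datetime and Text columns is only an example):

```python
# Hypothetical sketch: split the API text on line breaks, then on
# semicolons, and keep just two columns from each row.
import csv
import io

raw = (
    "datetime;mois;semaine;joursem;heure;vacance;Text;freQ;Scored Labels\n"
    "7/15/2017 6:00:00 PM;7;28;7;18;1;29,234;67;148,2313385\n"
    "10/14/2016 4:00:00 AM;10;42;6;4;0;18,922;0;-9,692166328\n"
)

# DictReader uses the first line as the column headers.
reader = csv.DictReader(io.StringIO(raw), delimiter=";")
rows = [{"datetime": r["datetime"], "Text": r["Text"]} for r in reader]
print(rows[0])  # {'datetime': '7/15/2017 6:00:00 PM', 'Text': '29,234'}
```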
Thank you for your help.
Paulie78's process to convert csv to JSON worked perfectly for me. thank you!
Thanks @DamoBird365,
Yes, the batch actions can be more complicated to work with & understand.
I personally swap out the Results Compose actions for Append to variable actions, using an expression that returns the result text only if it identifies an error. Then at the end of the flow I have a condition checking the length of that variable. If it has some text/length to it, then something failed, and it sends me an email with a link to the flow run.
However that’s a lot to explain on top of the main parts of the set-up so I don’t tend to cover that in my explanation videos & posts.
Hi @takolota
Thanks for sharing. I've played with the batch process too; it's complex to set up but worth it for efficiency. If I recall right, the batch response contains individual responses for each API call, so you might want to filter them for failures, as the overall action will succeed regardless. @Paulie78 will be delighted to see you have enjoyed his video.
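The filtering idea above can be sketched like this (the response shape is simplified and assumed, not the exact SharePoint $batch payload): the outer call reports success, so you inspect each inner status code yourself.

```python
# Sketch: a batch call returns HTTP 200 overall even when individual
# operations inside it fail, so check each sub-response status.
responses = [
    {"id": 1, "status": 201, "body": "Created"},
    {"id": 2, "status": 400, "body": "Invalid column value"},
    {"id": 3, "status": 201, "body": "Created"},
]

failures = [r for r in responses if r["status"] >= 400]
if failures:
    print(f"{len(failures)} item(s) failed:", failures)
```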
I've also since explored using Office Scripts to parse a CSV into Excel, which uses a regular expression: https://youtu.be/9J6ThPWGaG0.
Great to see different possibilities and options!
EDIT: This only works if the records with commas have double quotes around them or if you can save the CSV in a text file format where it will apply the double quotes to those records with commas.
Hello All,
I ran into a similar issue when pulling CSV data in from Power Automate Desktop. It comes in as a big string and some columns have commas in the actual data. I also want to process potentially thousands of rows at a time so I need it to be more efficient than these Apply to each loop options.
So I custom built something in Power Automate that can change the delimiter of just the column separator commas. No loops, only a few actions regardless of the number of rows. I currently only set it to handle up to 5 columns with actual commas in their data, but I can expand that as much as needed.
From there I can pretty easily use the Select action splitting on line breaks for the From field and splitting on my new delimiter ( | ) in the value fields.
Then I use that Select action within a SharePoint batch create or SharePoint batch update set-up to load 500 rows at a time with only a few actions.
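As a rough illustration of the delimiter-swap idea (not the actual flow actions, and per the EDIT note it only works when comma-containing fields are double-quoted), here is a regex sketch in Python that replaces only the column-separator commas with a pipe, leaving commas inside quoted fields alone. The sample line is invented:

```python
import re

# The lookahead matches a comma followed by an EVEN number of remaining
# double quotes, i.e. a comma that sits outside any quoted field.
line = 'id,"Smith, John",42,"Dallas, TX"'

swapped = re.sub(r',(?=(?:[^"]*"[^"]*")*[^"]*$)', '|', line)
print(swapped)  # id|"Smith, John"|42|"Dallas, TX"
```

Each line can then be split on `|` safely, which is what the Select action does with the new delimiter.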
Create: https://youtu.be/2dV7fI4GUYU
Update: https://youtu.be/l0NuYtXdcrQ
I get an "Access Denied" when I try to view the link pointed to by the solution. Why?
Hi @tutankh
Thanks for the feedback. I appreciate what you mean with regards to a delimiter within a field, like a quote encapsulated CSV. I’ve a demo here where I convert an example into pipe delimited to cater for that exact scenario but not a copy paste sadly.
The copy paste solution does, though, cater for multiple fields, using Parse JSON for the unknown field names as they are dynamically generated from the header line. It also includes the sample payload. I tried to think of a way to make the number of keys/values dynamic too, but I stuck with 12 based on my example, allowing the next person to remove (or copy/increase) the number of lines as they see necessary.
Did you try the whole copy paste solution or just the basic select action?
I’m keen to explore new ideas as even if this isn’t the best solution, I learn by exploring the possibilities.
Cheers,
Damien
P.S. take a look at my new blog here
I like that this is a copy paste solution for simple csv parsing. Your particular solution doesn't (and I don't think it's meant to, so this isn't a finger-pointy type thing) account for CSVs when you don't know the field names or the number of them that you might have. It also will break if any of your column values contain your delimiter. So if you were parsing a CSV that had AD distinguished names in it or something, it wouldn't work.
Otherwise - good copy/paste!
@pedwin / @tutankh / @Paulie78
I just wanted to throw my idea into the mix - I know we are waiting on a solution via ideas but why not take a look at this option
I've developed a copy/paste solution which you can quickly re-use for different-sized CSVs, quickly updating the payload for the Parse JSON action.
Cheers,
Damien
P.S. take a look at my new blog here
@pedwin In addition to the video, I've written up the process for parsing the CSV on my blog: How to Parse CSV files with Power Automate
You can download my flow from there and import into your environment to get you started quickly.
Thanks Paulie78 - this definitely helps, as my use case involves consuming multiple CSV files that have fewer than 50 records.
But I do agree with tutankh: Microsoft should provide this feature as a built-in action.