I'm picking up a file from an FTP site (a CSV, ~200k rows). The goal is to move the file somewhere our on-prem servers can read it from SQL Server, then bulk insert the data into a table. The problem is that some fields contain embedded commas, which break the parsing when the file is read and written into the database. So the task is really to remove those commas from the fields before the upload.
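To illustrate the kind of cleanup I mean, here is a minimal sketch (not what I'm running today, since my pipeline is Power Automate based): it assumes the source file is properly quoted CSV, uses Python's `csv` module to parse the quoted fields, and strips the commas inside each value so the output can be bulk-inserted with a plain comma delimiter. The sample data is made up.

```python
import csv
import io

def strip_embedded_commas(src_text: str) -> str:
    """Read quoted CSV text and remove commas inside field values,
    so every row splits cleanly on the comma delimiter afterwards."""
    reader = csv.reader(io.StringIO(src_text))
    out = io.StringIO()
    writer = csv.writer(out, quoting=csv.QUOTE_MINIMAL, lineterminator="\n")
    for row in reader:
        # Drop commas inside each field; the row keeps its column count.
        writer.writerow([field.replace(",", "") for field in row])
    return out.getvalue()

# A field like "Smith, John" becomes Smith John and no longer needs quoting.
sample = 'id,name,city\n1,"Smith, John",Boston\n'
print(strip_embedded_commas(sample))
```

Something in this spirit, but reachable from my flow, is what I'm after.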
One route I have tried is using an Office Script on a blank Excel file to populate it with the contents of the CSV, so that I can then remove the commas from the resulting .xlsx with another script. This works with smaller files but always fails with the actual file I need to use.
I have also tried using Power Automate Desktop to open the file in Excel on a remote machine, save it as an .xlsx, and close Excel so I can remove the commas there, but I have been running into issues with that approach as well.
I have seen many solutions to this issue, but because the file is so large, any solution involving an "apply to each" loop over the rows would take too long.
I can elaborate on any of the above points. Any help is greatly appreciated!