Power Automate

How to convert csv to JSON with high efficiency

Posted by Sebastian_Zhao
Hi there, 
 
I have a business need to sync CSV files in a SharePoint folder to an on-premises SQL Server database.
 
The solution I'm adopting is to convert the CSV file into JSON first and then use a stored procedure in SQL Server to update the database.
 
The current flow works, but it is inefficient and may exceed the Power Platform Requests limit.
 
Is there a way to avoid the explicit loop in my flow, to improve efficiency and reduce the number of Power Platform Requests used? 
 
Please see the attachment for more details.
 
Best regards,
Sebastian
  • Verified answer
    WillPage (Super User 2025 Season 2)
    I think the best way is to hand off the CSV-to-JSON conversion to a third-party connector, or build your own in an Azure Function; you can do it in a few lines of PowerShell.
     
    But if you can't for whatever reason (premium Power Automate licensing being the most obvious), then consider using a Select instead of an Apply to each.
     
    Add a Select action. The input is the output of your splitNewLine Compose action. Click the button on the right to change the Select from key-value input to JSON input, and put whatever formula you have in your splitByComma Compose in there, except instead of items('Apply_to_each') it's just item() to reference the row context.

    Then you can delete the Apply to each; the output of the Select will be the same as the outer Compose you've got below the loop now, and it'll run in seconds instead of minutes.
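
    As a rough sketch of that idea (not taken from the original flow): assuming each element of splitNewLine is one plain comma-separated row, and using made-up column names Col1, Col2, Col3, a Select in its default key-value mode could map each column as below. If your splitByComma formula already builds the whole row object, switch the Select to JSON input as described above and paste that formula instead, replacing items('Apply_to_each') with item().

    From:  outputs('splitNewLine')
    Map:
      Col1  ->  split(item(), ',')[0]
      Col2  ->  split(item(), ',')[1]
      Col3  ->  split(item(), ',')[2]

    The output of the Select is then a JSON array with one entry per CSV row, produced by a single action instead of one loop iteration per row.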
  • Suggested answer
    Riyaz_riz11 (Super User 2025 Season 2)
    Hi,
     

    Option 1: Bulk Insert in One Request

     

    Instead of looping, send all rows to SQL in one call.

     

    How to do this:

    1. Read the CSV in Power Automate.

    2. Compose the entire file (or a chunk) into a JSON array or delimited string.

    3. Pass the whole data payload into your stored procedure.

    4. In the stored procedure:

      • Parse the JSON (using OPENJSON) or split delimited text.

      • Do your INSERTs or MERGEs in set-based operations.

     Example Flow Outline

    Power Automate:

    • Trigger: When file created or modified in SharePoint folder.

    • Get file content: Get the CSV.

    • Compose: Convert to JSON array (you can do this with the Create CSV Table / Parse JSON data operations).

    • Call stored procedure:

      • Pass one parameter with all JSON (text).

      • e.g., EXEC dbo.SyncCsvData @Json = N'[{...},{...},...]'
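
    As a sketch of the hand-off on the flow side (the action and parameter names here are assumptions, not part of the original answer): if the JSON array comes out of a Select action named Select, the @Json parameter of an Execute stored procedure (V2) action could be set with the expression

        string(body('Select'))

    string() serializes the Select output into JSON text, which the OPENJSON call in the procedure below can then shred into rows.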

    SQL Server Stored Procedure:

    CREATE PROCEDURE dbo.SyncCsvData
        @Json NVARCHAR(MAX)
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Parse the JSON array into rows and insert them in one set-based operation
        INSERT INTO YourTargetTable (Col1, Col2, ...)
        SELECT
            j.[Col1],
            j.[Col2],
            ...
        FROM OPENJSON(@Json)
        WITH (
            Col1 NVARCHAR(200),
            Col2 INT,
            ...
        ) AS j;
    END

    This avoids any looping in Power Automate.

     Option 2: Use SQL Server’s BULK INSERT

    If you have direct access to the file path from your SQL Server, you can avoid sending the content over Power Automate altogether:

    • Power Automate saves the CSV to a network share accessible to SQL Server.

    • Then you trigger a stored procedure that does:

      BULK INSERT YourTargetTable
      FROM '\\FileServer\Path\yourfile.csv'
      WITH (
          FIELDTERMINATOR = ',',
          ROWTERMINATOR = '0x0a',
          FIRSTROW = 2
      );

    • This is the fastest approach, but it requires that:

      • SQL Server can reach the file path.

      • The SQL Server service account has permission to read it.

     Option 3: Use Dataflows or Azure Data Factory

    If your CSVs are large or frequent, Dataflows or Azure Data Factory pipelines are a better fit:

    • Dataflows (Power Platform):

      • They can pull CSVs from SharePoint directly and ingest into SQL tables without looping.

    • Azure Data Factory:

      • Designed for exactly this: batch ETL.

      • It can do:

        • Incremental loads.

        • Large files.

        • Scheduling.

        • No per-row API calls.

    If I have answered your question, please mark it as the preferred solution ✅ . If you like my response, please give it a Thumbs Up 👍.
    Regards,
    Riyaz

  • Verified answer
    Chriddle (Super User 2025 Season 2)
    Instead of using “Apply to each,” use “Select,” as shown here: [screenshot of the Select action configuration]
  • Sebastian_Zhao
    The Data Operation Select solved my problem perfectly. 
    I've also encountered the problem of a blank row in my source CSV file, and this post addresses exactly that: https://community.powerplatform.com/forums/thread/details/?threadid=2f2b5260-d737-f011-8c4e-000d3a990cc2

