


Posted by takolota1 (Moderator)

A way to generate efficient SQL batch queries for any CRUD operations in any flow or Power App.

This uses Power Apps formulas or flow actions to transform Power Apps table data or flow-action output JSON into CSV-like data. The main part of the flow then efficiently transforms that CSV-like data into SQL queries that can Create, Read, Update, or Delete all of that data within a single SQL Query action. There are no slow-running loops in the app or flow.
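To see the batching idea in isolation, here is a minimal Python sketch (not the actual flow logic; the table and column names are just the examples from this post) of how many row updates collapse into one SQL string that a single query action can run:

```python
# Sketch of the batching idea: combine many row updates into one SQL
# string so a single "Execute a SQL query" action runs them all,
# instead of looping one query per row.
def build_batch_update(table, key, rows):
    statements = []
    for row in rows:
        # Values are assumed pre-formatted, as in the post's CSV step:
        # strings already wrapped in single quotes, numbers left bare.
        sets = ", ".join(f"{c} = {v}" for c, v in row.items() if c != key)
        statements.append(f"UPDATE {table} SET {sets} WHERE {key} = {row[key]};")
    return "\n".join(statements)

rows = [
    {"ID": 1, "Comment": "'Test1'", "Col2": "'Test1.2'"},
    {"ID": 4, "Comment": "'Test4'", "Col2": "'Test4.2'"},
]
batch_sql = build_batch_update("dev.TestTable", "ID", rows)
```

However many rows the app sends, the database connector is only invoked once, which is where the speed-up comes from.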

Power Apps Test Set-Up
BatchSQL1.png
BatchSQL2.png

Example Collection Data 

ClearCollect(Collection1,
{ID: 1, Comment: "Test1", Col2: "Test1.2"},
{ID: 4, Comment: "Test4", Col2: "Test4.2"},
{ID: 5, Comment: "Test5", Col2: "Test5.2"},
{ID: 6, Comment: "Test6", Col2: "Test6.2"},
{ID: 7, Comment: "Test7", Col2: "Test7.2"});


Example Collection Data Transformation & Flow Submission 


//Format the collection data for the flow, then send the data & parameters to the flow.
//Set delimiters. Choose character combinations very unlikely to show up in any of your data.
Set(LineDelimiter, "~|"); 
Set(Delimiter, "#$");
//Create the CSV-like dataset with the custom delimiters. Column names in the 1st header row must match the column names in SQL. Any string data must be wrapped in single quotes ' '; other datatypes like numbers do not require anything extra.
Set(DataCSV, 
Concatenate("ID",Delimiter,"Comment",Delimiter,"Col2",LineDelimiter, 
Concat(Collection1, Concatenate(Text(ID),Delimiter,"'",Comment,"'",Delimiter,"'",Col2,"'",LineDelimiter)))
);
//Remove the trailing linebreak characters at the end of the data 
Set(DataCSV, Left(DataCSV, Len(DataCSV) - Len(LineDelimiter))); 
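For reference, the same transformation sketched in Python shows exactly what DataCSV looks like with the delimiters above (the join already leaves off the trailing line delimiter that the last Power Fx line trims):

```python
# Reproduces the DataCSV string built by the Power Fx above: a header
# row plus one line per record, joined with the custom delimiters.
line_delimiter, delimiter = "~|", "#$"
records = [(1, "Test1", "Test1.2"), (4, "Test4", "Test4.2")]

header = delimiter.join(["ID", "Comment", "Col2"])
body = line_delimiter.join(
    # Strings get single quotes; numbers (like ID) are left bare.
    f"{rid}{delimiter}'{comment}'{delimiter}'{col2}'"
    for rid, comment, col2 in records
)
data_csv = header + line_delimiter + body
```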

//Send the parameters & data to the flow.
//Service, Database, Table, CRUD-Operation, SQL Primary-Key Column Name, Delimiter, Line-Delimiter, Data, & Read-Query (Only for READ calls like "SELECT * FROM Table1", enter a dummy value if not a read call).
Set(DataOutput,
'SQL-BatchActions'.Run("SQL", "TestDatabase", "dev.TestTable", "UPDATE", "ID", Delimiter, LineDelimiter, DataCSV, "").dataoutput
);
//Get any response in the variable DataOutput

//Put the escaped double quotes "" before & after each string value in each regex key-value pair, but do not use them with any non-string datatypes.
//So if one of the middle key-value pairs were "Test2": "AbcString", then use the regex ""Test2"":""(?<Test2>[^""]*)"" but if it were "Test2": 123, then use ""Test2"":(?<Test2>[^""]*)
//Also, the 1st key-value pair includes the opening object curly bracket with the escape backslash \{, and the last key-value pair includes the closing object curly bracket with the escape backslash \}
ClearCollect(Collection2,MatchAll(DataOutput,"\{""ID"":(?<ID>[^""]*),""Comment"":""(?<Comment>[^""]*)"",""Col2"":""(?<Col2>[^""]*)""\}"))
//More information on this JSON parsing method here: https://youtu.be/2xWiFkBf294
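The same named-group extraction can be checked in Python against a sample response shaped like the post's example data (Power Fx doubles the quotes inside its string literal, so the raw pattern uses single ones, and Python writes a named group as (?P<Name>...) instead of (?<Name>...)):

```python
import re

# Same extraction as the MatchAll above: named capture groups pull each
# field out of the flow's JSON response text (sample response below).
pattern = r'\{"ID":(?P<ID>[^"]*),"Comment":"(?P<Comment>[^"]*)","Col2":"(?P<Col2>[^"]*)"\}'
data_output = '[{"ID":1,"Comment":"Test1","Col2":"Test1.2"},{"ID":4,"Comment":"Test4","Col2":"Test4.2"}]'

rows = [m.groupdict() for m in re.finditer(pattern, data_output)]
# Every captured value comes back as text; non-string columns like ID
# still need casting (Value(...) in Power Fx, int(...) here).
```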


Power Automate Flow Overview Picture
BatchSQL3.png

Example Run Update Section
 BatchSQL4.png

You can download & import the example flow below. Thanks for any feedback!
