OK Flownauts -- need your help please!
I have a seemingly simple flow: I am running a SQL Server Stored Procedure via an on-premises gateway. The SP runs fine and returns data. I currently have it set up to return only a single row (TOP 1) for testing purposes.
I then pass the result set to a Parse JSON so I can readily utilize the variable data fields. I used the payload to create the schema and the parse runs fine, rendering all the data into the proper variable names.
I then try to use the output of the Parse JSON to create a SharePoint item, and that is where I hit the error.
"The execution of template action 'Apply_to_each_2' failed: the result of the evaluation of 'foreach' expression '@body('Parse_JSON')?['ResultSets']?['Table1']' is of type 'Null'. The result must be a valid array."
I have tried everything I can think of to debug the issue, simplifying to isolate the problem:
1. I deleted and re-added the create item step (several times).
2. I created a brand-new SharePoint list with only the Title column and tried to use it instead of the one I really want to use.
3. I tried storing the results set from the SP in an object variable and passing that to the Parse JSON (this seemed to solve the problem for the person in this post: https://powerusers.microsoft.com/t5/Using-Flows/Stored-Procedure-Table1-is-of-type-Null/m-p/317851)
4. I removed the create item completely and instead used an initialize variable / set variable pair to try to save the contents of the Name field from the Parse JSON to a string variable.
All resulted in the same error, as shown above, on the Apply to Each.
The image below is based on the simplest case, #4 in the list above. Left side is the flow detail; right side is the result of running the flow. You can see the Parse JSON is executing successfully and contains the output I would expect to see in each field. But when I try to use it... the flow fails every time.
How can I successfully utilize the JSON output? This one has me baffled.
LRVinNC
For anyone still encountering this issue, I found that you can get the Parse JSON statement to handle SQL Stored Procedures successfully by reformatting it a bit.
The problem is that the Stored Procedure wraps the results in a single object holding one array, titled "Table1". For example:
{
"Table1": [
{
"FirstName": "Sam",
"LastName": "Stern"
}
]
}
When it needs to just have the array, like this:
[
{
"FirstName": "Sam",
"LastName": "Stern"
}
]
This needs to be changed in both the schema and the content for the Parse JSON step. To get the schema, I took the result set from the Stored Procedure, removed the wrapping { "Table1": at the beginning and the } at the end, then entered that into the Parse JSON as a sample payload to generate the schema. So the sample payload should look like the second example above.
To get the content, I added Compose steps to transform the Stored Procedure result sets into a string, then take a substring excluding the first 10 characters (the first 10 characters are {"Table1": ) and the last character (the closing } ) to strip the result down to just the array we want. Then I reformat it as JSON using the json() function and use it as the content input for the Parse JSON step.
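As a sketch, assuming the stored-procedure action is named Execute_stored_procedure and the Compose steps are named Compose and Compose_2 (substitute your own action names), the expressions look like this:

```
Compose (result set to string):
    string(body('Execute_stored_procedure')?['ResultSets'])

Compose 2 (drop the first 10 characters and the trailing }):
    substring(outputs('Compose'), 10, sub(length(outputs('Compose')), 11))

Compose 3 (back to JSON, used as the Parse JSON content):
    json(outputs('Compose_2'))
```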
For more detail, please reference this blog I wrote about it:
I had the exact problem and the solution mentioned has helped. Thanks very much @RezaDorrani for the solution.
The only change I had to make to access properties was to use items instead of item: items('<Name of Apply to each step>')?['name']
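For anyone comparing the two forms, both reference the current loop item; the step and column names below are illustrative:

```
item()?['Name']
items('Apply_to_each')?['Name']
```

item() resolves to the current item of the innermost loop, while items('<step name>') names the loop explicitly, which is what you need in nested Apply to each steps or when the implicit reference doesn't resolve.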
I've run into an issue in the past with JSON where the inbound data did not match the schema the Parse JSON action built, even though I copied it from the actual API.
This happened with a custom api connector I had set up.
What I ended up doing is removing one parent level from the inbound data of the nested JSON array object and then it found my data.
In my scenario this happened because the custom connector's test run was returning slightly different JSON than what the Flow action returns.
Hi there!
Try adding the "null" bit to the expression builder rather than typing it into the condition box. I've seen instances where Flow doesn't treat the values correctly if they're not an expression.
So... @edgonzales, I added a condition to check for null in name before the append to string and lo and behold, even though I SEE a value in Name in the output of the Parse JSON, the following step only sees Null. Any thoughts on why that could be?
@edgonzales Yep, same error.
The execution of template action 'Apply_to_each' failed: the result of the evaluation of 'foreach' expression '@body('Parse_JSON')?['ResultSets']?['Table1']' is of type 'Null'. The result must be a valid array.
I stripped the schema down to just:
{
  "properties": {
    "OutputParameters": {},
    "ResultSets": {
      "properties": {
        "Table1": {
          "items": {
            "properties": {
              "Name": {}
            },
            "required": [ "Name" ]
          }
        }
      }
    }
  }
}
Interestingly, it still outputs ALL of the fields, even though the schema only includes Name. I suspect that has something to do with why it is still failing. Reading John Liu's article I would have only expected it to output the fields I left in the schema, but that doesn't seem to be the case.
Are you getting the same error as before, or is it something different now? Looking at the Flow, it might get cranky about having a "Set Variable" inside the Apply to Each loop. Every time I do that, it yells at me because it knows it's going to just overwrite itself with each successive run.
If the variable needs to be set in that loop, try using "Append" instead. If you're only getting one result, or can format the input as JSON, then it should work ok.
If it's the same error, double-check the output from previous steps. I'm not familiar enough with SQL, but if the query is broken and doesn't offer any results, it might still cause some pain.
Lastly, try clearing the line below:
"type": "array"
It's a stretch, but might be the thing.
-Ed-
@edgonzales and @RezaDorrani thank you both for quickly coming to my rescue!
The good news is that I DO now have a working flow and loaded 60 test records successfully. I had to go the expression route to get it to work - thanks @RezaDorrani . (I had actually tried that earlier too but couldn't get it to work. Even the first time I tried it with your guidance @RezaDorrani it still didn't work, but I remembered something I did to make an expression in a past flow work, so I simplified the name of the SP step to just Step1 and with that was able to get it to work.) Once it worked for Name, I loaded it up for all 19 columns.
Before that, however, being inherently lazy and not wanting to have to do all that typing (LOL), I tried to fix the JSON, because it is a lot easier to use dynamic content than to brute-force all the expressions. But even after reading John Liu's excellent post as well as your blog @edgonzales (thanks for both!), it still didn't work. I would still like to figure out why, because I'm going to write a blog post for my site on this when I get it working, demonstrating both methods.
First I removed all the type specifications from the schema as you suggested... same result. Then I removed all but Name from the required list... same result. Then I removed all but Name from the properties section too... same result. The Parse JSON step executes just fine in all cases and appears to correctly populate the fields in question on the output side, but then barfs on the following set statement. Just FYI, here's the schema from my last, simplest run. By the way, none of the fields coming across can EVER be null because there is a COALESCE in the SQL for each one to prevent it.
{
  "type": "object",
  "properties": {
    "OutputParameters": {
      "type": "object",
      "properties": {}
    },
    "ResultSets": {
      "type": "object",
      "properties": {
        "Table1": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "Name": {}
            },
            "required": [ "Name" ]
          }
        }
      }
    }
  }
}
Again, thanks to you both for your help - I'm going to go ahead and mark this as solved, but @edgonzales if you have any more ideas on the Parse JSON side please let me know.
Hi there!
Take a look at John Liu's Thesis on Parsing JSON... it was a big help for me. Here's what I think is happening...
In your schema, it's assigning a type of "array" to one of the values. And that value is coming back in the results blank ("null") which is NOT an array and Flow is getting cranky. I've found the fix to be just removing the "type" lines from the schema...that way if Flow gets something, we're good...if it doesn't, we're still good. Check out my blog here, as well.
When we upload a sample payload, that's when Flow tries to assign a type to everything. So I just make it a habit of clearing that out especially if I think the value will come back null.
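For example, a property generated from a sample payload can be loosened like this (the field name is illustrative):

```
Before (generated from sample payload):
    "Name": { "type": "string" }

After (type removed, null-tolerant):
    "Name": {}
```

With the "type" line gone, a null value for Name no longer trips the schema validation.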
Keep us posted.
-Ed-
If this reply has answered your question or resolved your challenge, please consider marking it as a Solution. This helps other users find it more easily via search.
Hi @LRVinNC
I am not a fan of Parse JSON, so my solution is different (I am an expression fan).
Use the following as the Apply to each (for loop) expression:
body('Execute_LLExtract4Load_SP')?['ResultSets']?['Table1']
and then, within the loop, to get properties just use, for example:
item()?['Name']
Regards,
Reza Dorrani
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.