
SQL stored procedure on-premises data gateway timeout

Posted by MichelH

I found I can run an SQL stored procedure on the local SQL server, and it is very handy when the procedures are short-lived. However, when the procedure takes more than 2 minutes to run, the Flow seems to do a couple of extra things that I do not understand.

 

When I run the SP manually, it takes 5 to 7 minutes: it pulls data from a number of sources (linked servers) via a local view and combines them into a local table. This always works as expected; there is no issue with the SP itself.

 

When I run the SP via Flow, it does the first part, where it drops the existing table, and then things become unclear. After 2 minutes it signals a timeout, and from then on it seems to hammer the SQL server: my SQL Server Management Studio has a hard time getting any response. The flow and the server then go into limbo for about 30 minutes and seemingly do nothing but jam.

 

The flow runs with my personal credentials, the same ones I use in SQL Server Management Studio.

 

Is there anything I can do to prevent this timeout and retry after 2 minutes?

Is there anything I can do to prevent the explicit cancellation?

Any suggestions to make this work as expected?

 

I did look at and adjust the timeout settings, but I think the 'Note' tells me they will not help.

Timeout:  Limit the maximum duration an asynchronous pattern may take.
Note: this does not alter the request timeout of a single request.

 

[Screenshot: GatewayTimeout.jpg]

 

The error message: 

{
"error": {
"code": 504,
"source": "flow-apim-europe-001-northeurope-01.azure-apim.net",
"clientRequestId": "bba12345-a123-b456-c789-cf64d495e8d1",
"message": "BadGateway",
"innerError": {
"status": 504,
"message": "The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: A task was canceled.\r\nclientRequestId: bba12345-a123-b456-c789-cf64d495e8d1",
"source": "sql-ne.azconn-ne.p.azurewebsites.net"
}
}
}

 

The stored procedure:

-- Drop the existing snapshot table (if any), then rebuild it from [MH_OpenSO_dev]
DROP TABLE IF EXISTS [LocalDB].[dbo].[MH_test_dev]

SELECT * INTO [LocalDB].[dbo].[MH_test_dev] FROM [LocalDB].[dbo].[MH_OpenSO_dev]

  

Thanks for any feedback,

 

Michel

  • AlanPs1:

    Hi @MichelH

    Please see here

     

    User is reporting:

      "error": {
        "message": "BadGateway",
        "innerError": {
          "status": 504,
          "message": "The operation failed due to an explicit cancellation

    Then goes on to say: "We experienced this issue 8 times this morning. Opening a ticket with Microsoft to resolve. Seems like a Gateway code issue, or perhaps it has an issue connecting to SQL during some timeframes (Backups? DB Maint scripts?) Will post solution when Microsoft engages on the issue. We are collection the basic support info this morning on the case."

     

    Looks like they are engaging Microsoft; is this of any help to you?

    Maybe @SmartMeter can be asked for further assistance or information?

     

    If you, the reader, have found this post helpful, please help the community by clicking thumbs up.

    If this post has solved your problem, please click "Accept as Solution".

    Any other questions, just ask.

    Thanks, Alan

  • MichelH:

    Thanks for your response @AlanPs1,

     

    I did see that post but the issue occurred over a year ago and multiple updates to the gateway software have become available since.

    Also I can say that our gateway does work for short-lived jobs.

    As long as it doesn't hit the timeout, all is fine.

     

    Nevertheless it can't hurt to request feedback from @SmartMeter regarding his issue, so I just did.

     

    Cheers,

    Michel

  • MichelH:

    I'm more and more convinced the source of my problem is the hardcoded 2-minute timeout on a single request.

    There are timeout settings I can use in flow but they don't apply to a single request.

     

    [Screenshot: timeout2min.jpg]

     

    Does anyone have an idea how to work around this?

     

    Could I download the flow package file and add something to the JSON to set this timeout to a longer period?

     

     

    Thanks,

     

    Michel

  • Community Power Platform Member:

    Any recommendations on this? I am experiencing the same issue consistently.

  • MichelH:

    No, other than making sure the SP responds in less than 2 minutes.

     

    For everything else I made a 10-line Python script that connects to the SQL server and runs the procedures that take longer.

  • Cameron (Microsoft Employee):

    As others have mentioned, there is a 120-second timeout for the SQL Server connector. If you cannot decrease the time taken by the stored procedure, then I suggest using a queueing mechanism on the server. If you are willing to take on some complexity, others have found success using a "fire and forget" mechanism. It assumes that you have access to the SQL server and have permission to create tables, stored procedures, and server-side triggers. The general approach is:

    1) Create a stored procedure which performs the desired query (Master); make sure this procedure also accepts an identity value from the status table.

    2) Create a second stored procedure (MasterStart) which accepts the same parameters as the first (sans identity). This procedure should add those parameters to a state table (let's call it RunState) with an additional "status" column which defaults to "pending". This table must have an IDENTITY column AND a ROWVERSION column.

    3) Create a server-side SQL trigger which executes when new rows are inserted into the RunState table. When it discovers a new row with status "pending", it should execute the stored procedure created in step 1, using the parameters in the inserted row plus the identity of that row, and alter the status to "running". The stored procedure from step 1 must update the state table using the provided identity with meaningful values, most importantly setting the status to "complete" when it has finished.

    4) Your Flow should then call the stored procedure created in step 2 (MasterStart). This returns as soon as the state entry is created in the RunState table.

    5) Your second Flow step should be a trigger which observes the RunState table for updates. This step should filter out runs and only execute for rows whose status == "complete". If desired, you can include a "results" column in the table to communicate output values. It's also good to subsequently update the status to "closed", meaning you have successfully processed the completed run. I have also seen implementations that remove the state table row on completion after logging to a second table (ala RunHistory).

    The advantage of this approach is that the stored procedure can take as long as necessary, and it will not impact Flows or the connector.
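
    A minimal T-SQL sketch of the step 2 pieces, the RunState state table and the MasterStart entry point, using the names Cameron mentions (RunState, MasterStart, status, results); the @SourceName parameter and the column types are illustrative assumptions, to be replaced by the real parameters of the Master procedure:

    -- Step 2 state table: the IDENTITY and ROWVERSION columns called out above are
    -- what later lets a Flow trigger observe row changes.
    CREATE TABLE dbo.RunState
    (
        RunId      INT IDENTITY(1,1) PRIMARY KEY,
        SourceName NVARCHAR(128) NULL,     -- example parameter; mirror Master's real parameters
        Status     VARCHAR(20)   NOT NULL CONSTRAINT DF_RunState_Status DEFAULT ('pending'),
        Results    NVARCHAR(MAX) NULL,     -- optional output column (step 5)
        RowVer     ROWVERSION
    );
    GO

    -- Step 2 entry point: MasterStart only queues the request and returns at once,
    -- so the Flow call finishes well inside the connector's 120-second limit.
    CREATE PROCEDURE dbo.MasterStart
        @SourceName NVARCHAR(128) = NULL
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.RunState (SourceName, Status)
        VALUES (@SourceName, 'pending');
        SELECT SCOPE_IDENTITY() AS RunId;  -- hand the queued run's id back to the caller
    END
    GO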
  • MichelH:

    Thanks Cameron,

     

    I had an SSIS package on the SQL server that allowed me to run stored procedures on a schedule and, I think, also on a trigger.

    I no longer have this package on the new server, and was told to launch stored procedures using flow.

     

    Can SQL Server 'by itself' (without additional packages) have a 'server-side SQL trigger'?

    I will check out if this is possible, or if there are any additional constraints.

     

    That said, I tried a kind of fire-and-forget method, but Flow, after the timeout, actively tries to stop the stored procedure (because it thinks it has failed, I assume). If it just let the stored procedure do its thing, it would work for me too.

  • Cameron (Microsoft Employee):

    Michel, SQL Server side triggers allow you to kick off a process that is not tied to a Flow request. That's really the only way to make sure the process isn't terminated after a timeout. That's the crux of "fire and forget": it uses SQL server-side processing to execute the stored procedure. Anyone else who triggers the stored procedure would have to wait for it to complete, or terminate after some timeout. SQL Server doesn't have an async interface.

    If you do pursue an F-and-F approach: SQL Azure supports server-side triggers; Data Warehouse does not. See this reference: https://docs.microsoft.com/en-us/sql/t-sql/statements/create-trigger-transact-sql

    The workaround I described above allows you to do something beyond the norm, namely run extra-long stored procedures that cannot otherwise be limited to 120 seconds. To do this, I've described a method which requires some complexity and nuance, and isn't for the faint of heart. If you are limited to using Flows, then you probably want to simply reduce the amount of data you are processing at once and limit the amount of time your stored procedure takes to complete.

    Thanks, -Cameron
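
    A sketch of the step 3 trigger on RunState. One caveat: an AFTER INSERT trigger runs synchronously inside the inserting transaction, so executing dbo.Master directly from it would make MasterStart block until Master finished. A common way to keep the hand-off asynchronous (an assumption on my part, not something spelled out in the posts above) is to have the trigger start a SQL Server Agent job, named RunState_Worker here purely as a hypothetical, whose job step loops over 'pending' rows and runs dbo.Master for each:

    CREATE TRIGGER dbo.trg_RunState_Insert
    ON dbo.RunState
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Only react when a newly queued run arrives.
        IF EXISTS (SELECT 1 FROM inserted WHERE Status = 'pending')
        BEGIN
            -- RunState_Worker is a hypothetical SQL Server Agent job that picks up
            -- pending RunState rows and EXECs dbo.Master for each of them, keeping
            -- the long-running work off the caller's connection.
            EXEC msdb.dbo.sp_start_job @job_name = N'RunState_Worker';
        END
    END
    GO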
  • Community Power Platform Member:

    I'd say this is very amateurish on Microsoft's behalf. Great thinking outside the box, Cameron. Now there has to be a way to do more than set and forget? Perhaps a timer to poll a table which keeps state, so that you know when your trigger-started stored procedure finishes.
  • Cameron (Microsoft Employee):

    Yes, you will notice that in my description of the Fire-and-Forget approach, you can actually set your flow to watch for changes in the results table. You can then set a Flow trigger to fire when the result shows up (and therefore the stored procedure is complete). The most important part of the workaround is the ability to disconnect the flow that triggers the stored proc. Once that is in place, you can use all of the usual mechanisms you are used to (such as Flow triggers).
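
    For completeness, a stub of the step 1 Master procedure showing the part a Flow can watch for: the final update that sets the RunState row to 'complete' (which also changes its ROWVERSION value). The parameters and the Results text are illustrative; the commented-out section is where the long-running work from the original post would go:

    CREATE PROCEDURE dbo.Master
        @RunId      INT,
        @SourceName NVARCHAR(128) = NULL
    AS
    BEGIN
        SET NOCOUNT ON;
        UPDATE dbo.RunState SET Status = 'running' WHERE RunId = @RunId;

        -- The actual long-running work goes here, e.g. the rebuild from the
        -- original post:
        --   DROP TABLE IF EXISTS [LocalDB].[dbo].[MH_test_dev];
        --   SELECT * INTO [LocalDB].[dbo].[MH_test_dev] FROM [LocalDB].[dbo].[MH_OpenSO_dev];

        -- Mark the run complete; a Flow trigger watching RunState for modified rows
        -- can now pick it up and read the optional Results value.
        UPDATE dbo.RunState
        SET    Status  = 'complete',
               Results = N'done'
        WHERE  RunId   = @RunId;
    END
    GO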

     

    Enjoy,

    Cameron 
