Power Automate - Using Connectors

SQL stored procedure on-premises data gateway timeout


I found I can run a SQL stored procedure on the local SQL server via the gateway, and it is very handy when the procedures are short-lived. However, when the procedure takes more than 2 minutes to run, the Flow seems to do a couple of extra things that I do not understand.

 

When I manually run the SP that I want to launch via Flow, it takes 5 to 7 minutes: it pulls data from a number of sources (linked servers) via a local view and combines them into a local table. This always works as expected; there is no issue with the SP itself.

 

When I run the SP via Flow, it does the first part of the flow, where it drops the existing table, and then things become unclear. After 2 minutes it signals a timeout, and from then on it seems to hammer the SQL server: my SQL Studio has a hard time getting any response. The flow and the server then go into limbo for about 30 minutes and seemingly do nothing but jam.

 

The flow is running with my personal credentials, the same as I use in the SQL Studio. 

 

Is there anything I can do to prevent this timeout and retry after 2 minutes?

Is there anything I can do to prevent the explicit cancellation?

Any suggestions to make this work as expected?

 

I did look at and adjust the timeout settings, but I think the 'Note' tells me it will not help:

Timeout:  Limit the maximum duration an asynchronous pattern may take.
Note: this does not alter the request timeout of a single request.

 

[Image: GatewayTimeout.jpg]

 

The error message: 

{
  "error": {
    "code": 504,
    "source": "flow-apim-europe-001-northeurope-01.azure-apim.net",
    "clientRequestId": "bba12345-a123-b456-c789-cf64d495e8d1",
    "message": "BadGateway",
    "innerError": {
      "status": 504,
      "message": "The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: A task was canceled.\r\nclientRequestId: bba12345-a123-b456-c789-cf64d495e8d1",
      "source": "sql-ne.azconn-ne.p.azurewebsites.net"
    }
  }
}

 

The stored procedure:

DROP TABLE IF EXISTS [LocalDB].[dbo].[MH_test_dev];

SELECT * INTO [LocalDB].[dbo].[MH_test_dev] FROM [LocalDB].[dbo].[MH_OpenSO_dev];

  

Thanks for any feedback,

 

Michel

  • Cameron Profile Picture
    Microsoft Employee on at
    Re: SQL stored procedure on-premises data gateway timeout

    Long-running Stored Procedures for Power Platform SQL Connector

     

    The SQL Server connector in Power Platform exposes a wide range of backend features that can be accessed easily with the Logic Apps interface, allowing ease of business automation with SQL database tables.  However, the user is still limited to a 2-minute window of execution.  Some stored procedures may take longer than this to fully process and complete. In fact, some long-running processes are coded into stored procedures explicitly for this purpose. Calling them from Logic Apps is problematic because of the 120-second timeout. While the SQL connector itself does not natively support an asynchronous mode, it can be simulated using passthrough native query, a state table, and server-side jobs.

    For example, suppose you have a long-running stored procedure like so:

     

    CREATE PROCEDURE [dbo].[WaitForIt]
        @delay char(8) = '00:03:00'
    AS
    BEGIN
        SET NOCOUNT ON;
        WAITFOR DELAY @delay;
    END

     

    Executing this stored procedure from a Logic App will cause a timeout with an HTTP 504 result, since it takes longer than 2 minutes to complete. Instead of calling the stored procedure directly, you can use a job agent to execute it asynchronously in the background, storing inputs and results in a state table that a Logic App trigger can watch. You can simplify this if you don't need inputs or outputs, or are already writing results to a table inside the stored proc.

    Keep in mind that the asynchronous processing by the agent may retry your stored procedure multiple times in case of failure or timeout. It is therefore critically important that your stored proc be idempotent. You will need to check for the existence of objects before creating them and avoid duplicating output.
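    For example, a retry-safe rebuild might look like this (a sketch; the procedure, table, and view names are illustrative, not from the thread):

    ```sql
    -- Hypothetical idempotent rebuild: safe for the agent to retry from scratch.
    CREATE PROCEDURE [dbo].[RebuildSnapshot]
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Build into a staging table first; DROP IF EXISTS makes retries clean.
        DROP TABLE IF EXISTS [dbo].[Snapshot_staging];
        SELECT * INTO [dbo].[Snapshot_staging] FROM [dbo].[SourceView];

        -- Swap inside a transaction so readers never see a half-built table.
        BEGIN TRANSACTION;
        DROP TABLE IF EXISTS [dbo].[Snapshot];
        EXEC sp_rename 'dbo.Snapshot_staging', 'Snapshot';
        COMMIT TRANSACTION;
    END
    ```

    Because every statement either recreates its target or guards with IF EXISTS, a second run after a partial failure converges to the same end state.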

     

    For SQL Azure

    An Elastic Job Agent can be used to create a job which executes the procedure. Full documentation for the Elastic Job Agent can be found here: https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-jobs-overview

    You’ll want to create a job agent in the Azure Portal. This will add several stored procedures to a database that will be used by the agent.  This will be known as the “agent database”. You can then create a job which executes your stored procedure in the target database and captures the output when it is completed. You’ll need to configure permissions, groups, and targets as explained in the document above. Some of the supporting tables and procedures will also need to live in the agent database.

    First, we will create a state table to register parameters meant to invoke the stored procedure. Unfortunately, SQL Agent Jobs do not accept input parameters, so to work around this limitation we will store the inputs in a state table in the target database. Remember that all agent job steps will execute against the target database, but job stored procedures run on the agent database.

     

    CREATE TABLE [dbo].[LongRunningState](
        [jobid] [uniqueidentifier] NOT NULL,
        [rowversion] [timestamp] NULL,
        [parameters] [nvarchar](max) NULL,
        [start] [datetimeoffset](7) NULL,
        [complete] [datetimeoffset](7) NULL,
        [code] [int] NULL,
        [result] [nvarchar](max) NULL,
        CONSTRAINT [PK_LongRunningState] PRIMARY KEY CLUSTERED ([jobid] ASC)
            WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

     

    The resulting table will look like this in SSMS:

    [Image: Cameron_0-1602871250891.png]

    We will use the job execution id as the primary key, both to ensure good performance, and to make it possible for the agent job to locate the associated record. Note that you can add individual columns for input parameters if you prefer. The schema above can handle multiple parameters more generally if this is desired, but it is limited to the size of NVARCHAR(MAX).

    We must create the top-level job on the agent database to run the long-running stored procedure.

     

    EXEC jobs.sp_add_job
        @job_name = 'LongRunningJob',
        @description = 'Execute Long-Running Stored Proc',
        @enabled = 1

     

    We will also need to add steps to the job that will parameterize, execute, and complete the stored procedure. Job steps have a default timeout value of 12 hours. If your stored procedure will take longer, or you'd like it to time out earlier, you can set the step_timeout_seconds parameter to your preferred value in seconds. Steps also have (by default) 10 retries with a built-in backoff timeout in between. We will use this to our advantage.

     

    We will use three steps:

    The first step waits for the parameters to be added to the LongRunningState table, which should occur almost immediately after the job has been started. The step simply fails if the jobid hasn't yet been inserted into the LongRunningState table, and the default retry/backoff does the waiting for us. In practice, this step typically runs once and succeeds.

     

    EXEC jobs.sp_add_jobstep
        @job_name = 'LongRunningJob',
        @step_name = 'Parameterize WaitForIt',
        @command = N'
            IF NOT EXISTS(SELECT [jobid] FROM [dbo].[LongRunningState]
                          WHERE [jobid] = $(job_execution_id))
                THROW 50400, ''Failed to locate call parameters (Step1)'', 1',
        @credential_name = 'JobRun',
        @target_group_name = 'DatabaseGroupLongRunning'

     

    The second step queries the parameters from the state table and passes them to the stored procedure, executing the procedure in the background. In this case, we use @callparams to pass the timespan parameter, but this can be extended to pass additional parameters if needed. If your stored procedure does not need parameters, you can simply call the stored proc directly.

     

    EXEC jobs.sp_add_jobstep
        @job_name = 'LongRunningJob',
        @step_name = 'Execute WaitForIt',
        @command = N'
            DECLARE @timespan char(8)
            DECLARE @callparams NVARCHAR(MAX)
            SELECT @callparams = [parameters] FROM [dbo].[LongRunningState]
            WHERE [jobid] = $(job_execution_id)
            SET @timespan = @callparams
            EXECUTE [dbo].[WaitForIt] @delay = @timespan',
        @credential_name = 'JobRun',
        @target_group_name = 'DatabaseGroupLongRunning'

     

    The third step completes the job and records the results:

     

    EXEC jobs.sp_add_jobstep
        @job_name = 'LongRunningJob',
        @step_name = 'Complete WaitForIt',
        @step_timeout_seconds = 43200,
        @command = N'
            UPDATE [dbo].[LongRunningState]
            SET [complete] = GETUTCDATE(),
                [code] = 200,
                [result] = ''Success''
            WHERE [jobid] = $(job_execution_id)',
        @credential_name = 'JobRun',
        @target_group_name = 'DatabaseGroupLongRunning'

     

    We will use a passthrough native query to start the job, then immediately push the parameters into the state table for the job to reference. We will use the dynamic data output 'Results JobExecutionId' as the input to the 'jobid' attribute in the target table. We must add the appropriate parameters so the job can unpack them and pass them to the target stored procedure.

    Here's the native query:

     

    DECLARE @jid UNIQUEIDENTIFIER
    DECLARE @result INT
    EXECUTE @result = jobs.sp_start_job 'LongRunningJob', @jid OUTPUT
    IF @result = 0
        SELECT 202 AS [Code], 'Accepted' AS [Result], @jid AS [JobExecutionId]
    ELSE
        SELECT 400 AS [Code], 'Failed' AS [Result], @result AS [SQL Result]

     

    Here's the Logic App snippet:

    [Image: Cameron_1-1602871250902.png]
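    The push of parameters into the state table boils down to a simple insert; a sketch, assuming the JobExecutionId output of the native query is bound to @jid:

    ```sql
    -- Sketch: register the call parameters right after sp_start_job returns.
    -- @jid is the JobExecutionId from the native query; the parameter payload
    -- here is the delay that the second job step will pass to [dbo].[WaitForIt].
    INSERT INTO [dbo].[LongRunningState] ([jobid], [parameters], [start])
    VALUES (@jid, N'00:03:00', SYSDATETIMEOFFSET());
    ```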

     

    When the job completes, it updates the LongRunningState table so that you can easily trigger on the result. If you don’t need output, or if you already have a trigger watching an output table, you can skip this part.

    [Image: Cameron_2-1602871250915.png]
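    Conceptually, the trigger just watches for rows where [complete] has been filled in; the equivalent query is:

    ```sql
    -- Sketch: completed jobs are the rows with a non-NULL [complete] timestamp.
    -- The [rowversion] column is what lets the SQL connector detect row changes.
    SELECT [jobid], [code], [result], [complete]
    FROM [dbo].[LongRunningState]
    WHERE [complete] IS NOT NULL;
    ```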

     

     

    For SQL Server on-premises or SQL Azure Managed Instance:

    SQL Server Agent can be used in a similar fashion. Some of the management details differ, but the fundamental steps are the same.

    https://docs.microsoft.com/en-us/sql/ssms/agent/configure-sql-server-agent
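    As a rough sketch of the on-premises equivalent (assuming the SQL Server Agent service is running; the job, procedure, and database names are illustrative), the same fire-and-forget pattern uses the msdb procedures:

    ```sql
    -- Sketch for on-premises SQL Server Agent.
    -- @delete_level = 3 deletes the job after it runs (fire-and-forget).
    EXEC msdb.dbo.sp_add_job
        @job_name = N'LongRunningJob',
        @delete_level = 3;

    EXEC msdb.dbo.sp_add_jobstep
        @job_name = N'LongRunningJob',
        @step_name = N'Execute WaitForIt',
        @subsystem = N'TSQL',
        @command = N'EXECUTE [dbo].[WaitForIt] @delay = ''00:03:00''',
        @database_name = N'LocalDB';

    -- Register the job on the local server, then start it; control returns
    -- immediately, so the Flow call completes well inside the 2-minute limit.
    EXEC msdb.dbo.sp_add_jobserver @job_name = N'LongRunningJob';
    EXEC msdb.dbo.sp_start_job @job_name = N'LongRunningJob';
    ```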

     

  • Cameron Profile Picture
    Microsoft Employee on at
    Re: SQL stored procedure on-premises data gateway timeout

    The implementation here is admittedly tricky. I have an updated recommendation to use the Elastic Job Agent which, while requiring a new Azure feature, is much less error-prone and much more straightforward. It still requires the state table, since jobs cannot accept input or output parameters. This document will be published soon (it is in review), and I provide samples that can easily be built on for any stored procedure. You can also use the agent in on-premises SQL or Managed Instance to accomplish the same thing. I will post a link here when the document is published.

  • Community Power Platform Member Profile Picture
    on at
    Re: SQL stored procedure on-premises data gateway timeout

    4) Your Flow should then call the stored procedure created in step 2 above (MasterStart). This will return as soon as the state table entry is created in the RunState table.

     

    Some advice here, please. When I do this, the stored procedure waits for the entry to be created. The entry is created with status = "Pending" and waits for the Master stored procedure to finish (20 mins), so I am still stuck with the same timeout issue.

  • Community Power Platform Member Profile Picture
    on at
    Re: SQL stored procedure on-premises data gateway timeout

    I retract my statement. 

    1. Using a state table to start a stored proc. Check

    2. Updating the state table with "Complete" when finished. Check

    3. Using a Flow statement to monitor the SQL "Modified" trigger. Trouble ahead!

     

     

  • Cameron Profile Picture
    Microsoft Employee on at
    Re: SQL stored procedure on-premises data gateway timeout

    Hi Javier,

     

    Yes, the idea here is to have a server side trigger which executes the stored procedure using parameters inserted into the state table. When the stored proc finishes, it can update the same table with results and update the status. It can also insert result set(s) into a second table, which you can trigger on from LogicApps. I think you've got the gist of it.

     

    Thanks,

    Cameron

     

  • javisoft Profile Picture
    4 on at
    Re: SQL stored procedure on-premises data gateway timeout

    Hi Cameron,

    Thanks for the link. I have no issues creating triggers. When I followed your approach, you mentioned that the "key" was to create a "server-side" trigger, but I can't find how to create a "DML server-side" trigger. In other words, how do you create a DML trigger that fires at the server level instead of the table level? Only DDL and logon triggers appear either in the "Stored Procedures" section or at the server level.

    When I create a DML trigger (table level) to run a stored procedure that takes more than 2 min, I don't get the results back until the SP is finished. I think I'm missing something since I see others were able to follow your steps. 

    Anyhow, I found a workaround where the DML trigger creates a SQL Agent job that calls the SP. The job is deleted once the SP has executed. It simulates an async operation that is tracked by the status field in the table you mentioned in your approach.

    Thanks again for your reply.

    Best,

    Javier

     

  • Cameron Profile Picture
    Microsoft Employee on at
    Re: SQL stored procedure on-premises data gateway timeout

    Hey there Javier - check out this link.

    It has a lot of detail and some good examples.

  • Community Power Platform Member Profile Picture
    on at
    Re: SQL stored procedure on-premises data gateway timeout

    Sorry. It didn't work for me.

  • javisoft Profile Picture
    4 on at
    Re: SQL stored procedure on-premises data gateway timeout

    Hi Cameron,

     

    Could you please show how you create a server-side DML trigger? The fire and forget approach sounds like a winner, but I can't figure out how to create the server-side trigger. 

     

    Thanks in advance, 

    Best,

    Javier

     

     

  • Cameron Profile Picture
    Microsoft Employee on at
    Re: SQL stored procedure on-premises data gateway timeout

    Yes, you will notice that in my description of Fire-And-Forget approach, you can actually set your flow to watch for changes in the results table. You can then set a Flow trigger to fire when the result shows up (and therefore the stored procedure is complete).  The most important part of the workaround is the ability to disconnect the flow that triggers the stored proc. Once complete, you can use all of the usual mechanisms you are used to (such as flow triggers).

     

    Enjoy,

    Cameron 
