Power Platform Community / Forums / Power Automate
Check the Status and Get Results SnowFlake connector not working

Posted by drewbty
 
Snowflake's Submit SQL Statement for Execution connector splits the data into separate partitions when the total row count is above ~500. This is a known issue: https://github.com/microsoft/PowerPlatformConnectors/issues/2947

The suggested workaround is to use the Check the Status and Get Results connector with the StatementHandle from the previous Submit SQL Statement for Execution step, looping through all the partitions.

However, the Check the Status and Get Results connector only works when partition = 0. Any other partition (1+) returns a generic InternalServerError. The partition metadata from the Submit step:
[
  {
    "RowCount": 368,
    "UncompressedSize": 100173
  },
  {
    "RowCount": 2282,
    "UncompressedSize": 619773
  },
  {
    "RowCount": 382,
    "UncompressedSize": 99956
  },
  {
    "RowCount": 2309,
    "UncompressedSize": 625931
  },
  {
    "RowCount": 377,
    "UncompressedSize": 100136
  },
  {
    "RowCount": 2265,
    "UncompressedSize": 602687
  },
  {
    "RowCount": 374,
    "UncompressedSize": 100085
  },
  {
    "RowCount": 2277,
    "UncompressedSize": 616803
  },
  {
    "RowCount": 367,
    "UncompressedSize": 100181
  },
  {
    "RowCount": 2295,
    "UncompressedSize": 625229
  },
  {
    "RowCount": 363,
    "UncompressedSize": 100025
  },
  {
    "RowCount": 2316,
    "UncompressedSize": 634610
  },
  {
    "RowCount": 364,
    "UncompressedSize": 100180
  },
  {
    "RowCount": 2306,
    "UncompressedSize": 630971
  },
  {
    "RowCount": 360,
    "UncompressedSize": 100060
  },
  {
    "RowCount": 2346,
    "UncompressedSize": 647986
  }
]
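For reference, a quick sanity check on the metadata above (an illustrative Python sketch, not part of the flow): summing RowCount across partitions shows how many rows sit outside partition 0 and are effectively unreachable if only partition 0 can be read.

```python
# Partition metadata as reported by Submit SQL Statement for Execution
# (abbreviated to the first four partitions from the output above).
partitions = [
    {"RowCount": 368, "UncompressedSize": 100173},
    {"RowCount": 2282, "UncompressedSize": 619773},
    {"RowCount": 382, "UncompressedSize": 99956},
    {"RowCount": 2309, "UncompressedSize": 625931},
]

total_rows = sum(p["RowCount"] for p in partitions)
missing_rows = total_rows - partitions[0]["RowCount"]
print(f"{total_rows} rows total, {missing_rows} unreachable via partition 0 alone")
```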

 
 
 
  • VASANTH KUMAR BALMADI

    Hi,

    This issue with the "Submit SQL Statement for Execution" connector in Snowflake, where partitions beyond 0 fail when using the "Check the Status and Get Results" connector, is indeed challenging. Below is a step-by-step approach to troubleshoot and apply the workaround effectively.

    Steps to Resolve

    1. Confirm Statement Execution

    • Ensure the SQL statement is successfully executed and partitions are being created.
    • Retrieve the StatementHandle and validate the partition metadata returned from the "Submit SQL Statement for Execution" step.

    2. Verify Partition Metadata

    • Use the output metadata to confirm the RowCount and Partition IDs (0 to n).
    • Example Metadata Output:
      [
      { "RowCount": 368, "UncompressedSize": 100173 },
      { "RowCount": 2282, "UncompressedSize": 619773 },
      ...
      ]
      • Ensure you loop through the exact number of partitions indicated in this metadata.

    3. Use the Loop to Process Partitions

    • Implement a loop to iterate through all partitions.
    Implementation in Power Automate:
    1. Initialize Variables:

      • Use an Initialize Variable action to store the partition count (length of the metadata array).
    2. Apply to Each:

      • Loop through the array of metadata.
      • Retrieve the Partition ID dynamically for each iteration.
    3. Connector Call with Partition ID:

      • In each loop iteration, use the "Check the Status and Get Results" connector with:
        • StatementHandle: From the SQL execution step.
        • Partition ID: Dynamically passed from the loop.
    4. Aggregate Results:

      • Append each partition’s data to a final variable or process it incrementally.
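Outside Power Automate, the same per-partition loop can be sketched against Snowflake's SQL API, which these connector actions wrap (GET /api/v2/statements/{handle}?partition={n}). A minimal Python sketch, assuming the actual fetch step is injected; the helper names and account URL are illustrative, not connector internals:

```python
from typing import Callable, List

def partition_result_url(account_url: str, statement_handle: str, partition: int) -> str:
    # The Snowflake SQL API endpoint that Check the Status and Get Results
    # calls under the hood: one GET per result partition.
    return f"{account_url}/api/v2/statements/{statement_handle}?partition={partition}"

def collect_all_partitions(partitions_meta: List[dict],
                           fetch: Callable[[int], List]) -> List:
    # Iterate over every partition listed in the Submit step's metadata
    # and concatenate the rows; `fetch` stands in for one connector call.
    rows = []
    for partition_id in range(len(partitions_meta)):
        rows.extend(fetch(partition_id))
    return rows
```

In the flow itself, `fetch` corresponds to one Check the Status and Get Results call with the StatementHandle fixed and the partition index supplied by the loop.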

    4. Debugging InternalServerError for Non-Zero Partitions

    • The error could occur due to:
      1. Incorrect Partition ID being passed (ensure valid IDs are retrieved from metadata).
      2. Permissions or execution context issues in Snowflake for certain partitions.

    Troubleshooting Steps:

    1. Log Outputs:
      Add logging to capture:

      • The Partition ID being passed.
      • The response or error details from Snowflake.
    2. Validate Partitions in Snowflake:

      • Manually query Snowflake's execution results to ensure data is split across all expected partitions.

      Example SQL (if applicable):

      SELECT *
      FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
      WHERE QUERY_ID = '<your-query-id>';
    3. Test Individual Partitions:

      • Test the "Check the Status and Get Results" connector with specific Partition IDs (e.g., 1, 2, etc.) to isolate the issue.

    5. Workaround: Recombine Data in Partition 0

    If partitions beyond 0 are inaccessible, consider raising the issue with Snowflake support for further investigation. Meanwhile, recombine the data into a single partition using Snowflake queries if feasible:

    • Aggregate all data back to a single partition using a UNION in the SQL query before execution.

    Additional Recommendations

    • Contact Microsoft/Power Platform Support:
      Report this issue via Microsoft Power Platform or GitHub (if not already tracked) to ensure it’s acknowledged.

    • Use Alternative Approaches:

      • Query results directly from Snowflake using custom API calls if possible.
      • Avoid splitting into partitions for smaller datasets where feasible.

    Example of Loop in Power Automate

    1. Get Metadata and Initialize Loop:

      • Fetch the partition metadata from Submit SQL Statement for Execution.
      • Use a variable or length() expression to determine the total number of partitions.
    2. Apply to Each Partition:

      • Loop through partitions with the following dynamic parameters:
        • StatementHandle: Fixed value from the SQL execution step.
        • Partition ID: Dynamic value from the current iteration.
    3. Process Results:

      • Append partition results to a variable or process them in real time.
    ----------------------------------------------------------------------------------
    If this Post helped you, please click "Does this answer your question" and give it a like to help others in the community find the answer too!
  • drewbty (suggested answer)
    I have now got this to work by explicitly mapping out the schema info for all the columns in the dataset. You don't need to do this for partition = 0, but it is oddly required to return data from partitions >= 1. You can get the full schema info from the earlier Submit SQL Statement for Execution step. The important part seems to be that the column names are included.
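For reference, Snowflake's SQL API reports the column definitions under resultSetMetaData.rowType in the Submit step's response; a hedged Python sketch of pulling the column names out (the exact property names the Power Automate connector surfaces may differ):

```python
def column_names_from_submit(submit_response: dict) -> list:
    # Snowflake's SQL API lists column metadata under resultSetMetaData.rowType;
    # each entry carries at least "name" and "type". These names are what the
    # schema mapping needs before partitions >= 1 will return data.
    row_type = submit_response.get("resultSetMetaData", {}).get("rowType", [])
    return [col["name"] for col in row_type]
```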
     
    So you get the number of partitions with @{length(body('Submit_SQL_Statement_for_Execution')?['Partitions'])} and then use a Do Until loop to cycle through the partitions with the Check the Status and Get Results connector.
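Translated into ordinary code for clarity, the length() expression plus the Do Until loop look roughly like this (the response-body shape is taken from the expression above; get_results is an illustrative stand-in for the connector call):

```python
def run_all_partitions(submit_body: dict, get_results) -> list:
    # length(body('Submit_SQL_Statement_for_Execution')?['Partitions'])
    partition_count = len(submit_body.get("Partitions") or [])
    results = []
    partition = 0
    # Do Until: keep calling Check the Status and Get Results until the
    # counter reaches the partition count.
    while partition < partition_count:
        results.append(get_results(partition))
        partition += 1
    return results
```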
     
     
  • CU10070245-0
    Thank you for posting your answer - I was having the same issue, and setting the DataSchema got it working!
