Skip Token variable handling large volume of data


Hi All,

 

I'm trying to use the Skip Token feature to split the total record set into batches so that we can enable parallelism.

We set the Apply to each action's concurrency control to 20, but performance still didn't improve.

Scenario: We need to update roughly 20k records in Dataverse using Power Automate, but with Apply to each the processing takes approximately 50 minutes.

 

A) How can we improve performance for this scenario, i.e. reduce the processing time?

B) How can we achieve dynamic parallelism using batches?

 

Appreciate your suggestions.

  • cchannon (Moderator)

    So, really the answer depends on the service you're trying to call (I am guessing Graph API?), but almost certainly Skip Token isn't going to open you up for parallelization. Skip Token tells the API which page you called last so it knows to start you on the next one. So, if you call for (let's just call it) page one and get Skip Token XXXXXX, then call that same API 1, 20, or 2000 times with Skip Token XXXXXX, it will always give you page two. So, if you're pulling 100 records per page, then on each iteration each of your 20 threads is just running the same 100 records.
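A minimal sketch of that behaviour (Python, against a hypothetical endpoint with an assumed response shape, purely for illustration): reusing one skip token across threads just re-reads the same page, so paging stays sequential.

```python
# Hypothetical endpoint and response shape, for illustration only.
import requests

BASE_URL = "https://example.api/records"  # placeholder, not a real service

def get_page(skip_token=None):
    """Fetch one page and return (rows, token for the next page)."""
    params = {"$skiptoken": skip_token} if skip_token else {}
    resp = requests.get(BASE_URL, params=params, timeout=30)
    resp.raise_for_status()
    body = resp.json()
    return body["value"], body.get("skipToken")  # response shape is assumed

rows, token = get_page()        # page 1 -> token "XXXXXX"
page_a, _ = get_page(token)     # page 2
page_b, _ = get_page(token)     # still page 2: same token, same rows
# The only way forward is one page after another:
while token:
    rows, token = get_page(token)
```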

     

    Instead, take a look at the API documentation for whatever service you are trying to call. See if it offers you a $skip or similar query parameter that could allow you to skip a set number of rows per call. If so, then you can set each of your threads to skip a different number of rows, forcing them to look at different pages instead of all at the same one.
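A minimal sketch of that $skip approach (same hypothetical endpoint; it assumes the API actually supports $skip and $top, so check its documentation first): each worker gets a different offset, so the threads read different slices instead of the same page.

```python
# Minimal sketch: partition reads by offset, assuming the API supports $skip/$top.
import requests
from concurrent.futures import ThreadPoolExecutor

BASE_URL = "https://example.api/records"  # placeholder, not a real service
PAGE_SIZE = 5000
TOTAL_ROWS = 20000

def fetch_slice(offset):
    """Fetch PAGE_SIZE rows starting at the given offset."""
    params = {"$skip": offset, "$top": PAGE_SIZE}
    resp = requests.get(BASE_URL, params=params, timeout=60)
    resp.raise_for_status()
    return resp.json()["value"]  # response shape is assumed

offsets = range(0, TOTAL_ROWS, PAGE_SIZE)          # 0, 5000, 10000, 15000
with ThreadPoolExecutor(max_workers=4) as pool:
    slices = list(pool.map(fetch_slice, offsets))  # the four reads run in parallel
```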

  • KishoreJ

    Thanks @cchannon for your response, but the real issue here is that the parallelism is not working as expected.

     

    In our scenario we use a skip token variable to hold the next-page token value, and with that value we create 4 arrays to execute concurrently:

    First Array - 1 to 5k

    Second Array - 5001 to 10k

    Third Array - 10001 to 15k

    Fourth Array - 15001 to 20k

    Even after the processing has completed, the flow continues to run. If you look at the flow history, the First Array branch does not stop when its execution is finished; it keeps running for a while and only then stops.

  • rtabit

    Those 4 arrays aren't executing in parallel if you are using a skip token. You use the skip token from the first array to create the second array, the third array needs the second array's token before it can be created, and so on. The Apply to each only runs in parallel within one array at a time. The only situation where Apply to each concurrency gets noticeably faster is when all your data is going to separate places; it doesn't fix database waits. Try running the flow without doing anything inside the Apply to each; if it's fast, then the issue is on the DB insert side.
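A minimal sketch of that diagnostic outside the flow (hypothetical endpoints, assumed response shape): page through the records once without updating and once with updates, and compare the timings.

```python
# Minimal sketch of the suggested diagnostic: if paging alone is fast but
# paging + updates is slow, the bottleneck is on the write side, not the reads.
import time
import requests

BASE_URL = "https://example.api/records"          # placeholder, not a real service
UPDATE_URL = "https://example.api/records/{id}"   # placeholder, not a real service

def run(do_update):
    start = time.monotonic()
    token = None
    while True:
        params = {"$skiptoken": token} if token else {}
        body = requests.get(BASE_URL, params=params, timeout=60).json()
        for row in body["value"]:
            if do_update:
                # Field name is illustrative only.
                requests.patch(UPDATE_URL.format(id=row["id"]),
                               json={"statuscode": 1}, timeout=60)
        token = body.get("skipToken")              # response shape is assumed
        if not token:
            return time.monotonic() - start

read_only = run(do_update=False)   # paging cost alone
read_write = run(do_update=True)   # paging plus per-record updates
print(f"reads only: {read_only:.1f}s, reads + updates: {read_write:.1f}s")
```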
