
Hi everyone,
I’m developing a Power Apps Code App (React + VS Code) connected to a Dataverse database.
Using pac code add-data-source, I connected my app to the Dataverse tables and generated the corresponding service classes. These services allow me to perform basic Dataverse operations (create, update, delete, get, getAll) using the authenticated Power Platform user.
However, I’m now facing limitations when dealing with more complex scenarios.
For example, suppose I have a GroupItem entity (with a corresponding Dataverse table) that contains a collection of Items (another table). The real scenario is more complex, but let’s keep it simple for clarity.
I need to duplicate a GroupItem along with all its related Items.
Currently, I can:
Retrieve the GroupItem
Loop through its Items
Call service.create() for each Item
Manage the parent-child relationships manually
The problem is that this approach is very slow because it results in one separate Dataverse call per Item.
So my questions are:
What is the recommended approach for handling heavy or complex Dataverse operations efficiently?
Would it be better to implement something like a server-side “DuplicateGroup” function (Custom API, Action, or similar) within the Power Platform and call it from my React app with parameters?
Ideally, I would also like to:
Execute this logic using the authenticated Power Platform user
Be able to trigger it on a schedule (without user interaction in the app)
What would be the best architectural approach for this scenario?
Thanks in advance for your help!
Fabrice
You’re running into a real Dataverse limitation, not a Code App problem — and your instinct is correct: client-side duplication is the wrong layer for this kind of workload.
What you’re seeing (one Web API call per child record) is exactly why Microsoft does not recommend handling complex entity graphs from Power Apps Code Apps or Canvas Apps.
For heavy or multi-record Dataverse operations, the correct pattern is:
Client (React / Power Apps) → single server-side Dataverse operation → bulk logic executed inside Dataverse
That server-side operation is typically a Custom API backed by a plug-in, a classic Action, or standalone plug-in logic; for this scenario, a Custom API fits best.
First, why the current approach is slow. Your Code App is doing this:
React
├─ GET GroupItem
├─ GET Items (n records)
├─ POST Item #1
├─ POST Item #2
├─ POST Item #3
├─ ...
That means:
multiple HTTPS round trips
Dataverse throttling
no transaction scope
no batching support from pac services
no parallel execution guarantees
Duplicating even 100 related records becomes painfully slow.
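To make the cost concrete, here is a minimal TypeScript sketch of that client-side pattern. The service shape (ItemService, getAllByGroup, create) is an illustrative stand-in for the pac-generated service classes, not their real API:

```typescript
// Illustrative record and service shapes; the real pac-generated types differ.
interface Item { itemid: string; name: string; groupitemid: string; }

interface ItemService {
  getAllByGroup(groupId: string): Promise<Item[]>;
  create(item: Omit<Item, "itemid">): Promise<string>;
}

// Naive duplication: one awaited Dataverse call per child record.
async function duplicateGroupNaively(
  svc: ItemService,
  sourceGroupId: string,
  newGroupId: string
): Promise<number> {
  const items = await svc.getAllByGroup(sourceGroupId);
  let created = 0;
  for (const item of items) {
    // Each iteration is a separate HTTPS round trip to Dataverse.
    await svc.create({ name: item.name, groupitemid: newGroupId });
    created++;
  }
  return created; // n child records => n+1 round trips in total
}
```

The sequential awaits are exactly the latency and throttling problem described above: total time grows linearly with the number of child records.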
This is exactly what Custom APIs were built for.
A Custom API called something like DuplicateGroupItem:
Input parameter: GroupItemId (GUID)
Output parameter: NewGroupItemId (GUID)
Implement logic using:
Plug-in registered to the Custom API
Dataverse SDK
ExecuteMultipleRequest
Example logic flow:
1. Retrieve GroupItem
2. Retrieve all related Items
3. Create new GroupItem
4. Clone Items in memory
5. Insert Items using ExecuteMultiple
6. Return new GroupItem Id
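Step 5 is where the win comes from: instead of one request per clone, ExecuteMultiple groups many create requests into a single call. The plug-in itself would be written in C# with the Dataverse SDK; the batching idea can be sketched in TypeScript for readers of this thread (chunkIntoBatches and the batch size of 100 are illustrative, not SDK names):

```typescript
// Split the in-memory clones into fixed-size batches, so that
// n records need only ceil(n / batchSize) bulk requests instead of n calls.
function chunkIntoBatches<T>(records: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}

// Example: 250 cloned Items in batches of 100 => 3 bulk requests, not 250 calls.
const clones = Array.from({ length: 250 }, (_, i) => ({ name: `Item ${i}` }));
const batches = chunkIntoBatches(clones, 100);
```

The same grouping is what ExecuteMultipleRequest does on the server side, which is why the number of round trips collapses from n to a handful.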
All of this runs:
inside Dataverse
in the same region
in a single transaction context
without HTTP latency
The performance difference is massive.
✔ One call from React
✔ Executes as authenticated Power Platform user
✔ Supports impersonation
✔ Can be reused anywhere
✔ Much faster
✔ Centralized business logic
✔ Transaction support
✔ Works with thousands of related records
This is exactly how Microsoft handles:
Quote → Quote Lines copy
Opportunity → Products
Case duplication
Order cloning
Custom APIs support:
CallerId execution
Full Dataverse security model
Row-level and column-level security
Audit history
So the records will show:
“Created by: John Smith”
not by a service account.
Since the logic is server-side, you get this for free.
You can trigger the same Custom API from:
Power Automate (scheduled flow)
Azure Function
Logic Apps
Dataverse workflow
React Code App
No UI required.
From your Code App, you just call the Custom API endpoint:
POST /api/data/v9.2/DuplicateGroupItem
with:
{
  "GroupItemId": "GUID"
}
That’s it.
One call. Everything else happens inside Dataverse.
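In TypeScript, that single call can be wrapped in a small helper. This is a sketch, not the definitive implementation: the path mirrors the example above (in practice an unbound Custom API's unique name usually carries a publisher prefix, e.g. new_DuplicateGroupItem), authentication headers are omitted because the Code App platform supplies the signed-in user's token, and the injectable fetchImpl parameter exists only to keep the sketch testable:

```typescript
// Shape of the fetch function the helper needs; the global fetch matches it.
type FetchLike = (
  url: string,
  init: { method: string; headers: Record<string, string>; body: string }
) => Promise<{ ok: boolean; status: number; json(): Promise<any> }>;

// Single round trip: POST the Custom API with its one input parameter.
async function duplicateGroupItem(
  baseUrl: string,
  groupItemId: string,
  fetchImpl: FetchLike
): Promise<string> {
  const response = await fetchImpl(`${baseUrl}/api/data/v9.2/DuplicateGroupItem`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ GroupItemId: groupItemId }),
  });
  if (!response.ok) {
    throw new Error(`DuplicateGroupItem failed: ${response.status}`);
  }
  const result = await response.json();
  return result.NewGroupItemId as string; // the Custom API's output parameter
}
```

Everything between the request and the returned NewGroupItemId runs inside Dataverse.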
| Option | Why not |
|---|---|
| Client-side looping | Slow, throttled |
| Canvas formulas | Not scalable |
| Power Automate per-record loops | Even slower |
| Virtual tables | Not transactional |
| Direct SQL | Not supported |
| pac service classes | CRUD only |
Use:
Custom API + plug-in → business operation
Plug-in alone → event-based logic (Create/Update)
For duplication, Custom API is the correct choice.
Power Apps Code App (React)
↓
Single call to Custom API
↓
Dataverse plug-in logic
↓
ExecuteMultiple bulk insert
↓
Return new GroupItemId
Yes — your assumption is correct.
✔ Do NOT duplicate complex entity graphs in the client
✔ Do NOT loop Dataverse calls from React
✔ Implement a server-side Custom API
✔ Execute logic inside Dataverse
✔ Call it from the app with parameters
✔ Reuse it from scheduled Power Automate flows
This is the same pattern Microsoft uses internally for all complex Dataverse operations.