Power Platform Community / Forums / Copilot Studio / File upload in chat li...
Copilot Studio
Answered

File upload in chat limitation: Conversational vs. Declarative

Posted by CU27020548-0
Hi all, 

Does anyone have an explanation of the limitations of the file upload feature in chat? I have a use case that requires users to upload large (~25 MB) Word documents to the chat to be processed inside a topic / Power Automate workflow. According to other posts I've seen on here, it should be possible by passing the file to an AI Builder node inside a Power Automate flow. But it seems that even though I'm using the recommended approach, any attempt to deal with a file larger than maybe 15 MB either errors or just resets the chat window. 

I've tried processing the document with just the orchestrator inside a declarative agent built in Copilot Studio, and it processed the 25 MB Word doc without issue. So I'm a bit confused about where the limitation lies. 

Cheers, 
  • Verified answer
    Assisted by AI
    Prasad-MSFT (Microsoft Employee)
    File Upload Limitations in Copilot Studio & Power Automate
    File Size Limits in Copilot Studio Chat
    The chat interface in Copilot Studio has a practical file upload limit, often around 15–20 MB per file. This is not always well-documented, but larger files can cause errors, timeouts, or even reset the chat session.
    The limit is enforced to ensure performance, prevent abuse, and avoid overloading the backend services.
    Power Automate Flow Limits
    Power Automate has its own file size limits for triggers and actions. For example, the “When a file is created” or “Get file content” actions may have limits (often 15–100 MB, depending on the connector and licensing).
    AI Builder actions (like “Extract information from documents”) also have file size and page count limits, which can vary by model and region. For Word documents, the limit is often 20 MB, but practical performance may degrade before that.
    Timeouts and Memory Constraints
    Even if a file uploads, processing large files can hit timeouts or memory limits in Power Automate or Copilot Studio, especially in a chat context where synchronous processing is expected.
    Declarative Agent vs. Chat Upload
    When you process a file with an orchestrator or declarative agent (not via chat), the system may use a different pipeline or have higher limits, which is why your 25 MB file worked there.
    The chat upload feature is more restrictive due to real-time user experience and resource management.
    ----------------------------------------------------------------------------------------------------------------------------
    Recommendations
    • Keep files under 15 MB for reliable chat uploads.
    • For larger files, use a different entry point:
      • Upload the file to SharePoint, OneDrive, or Azure Blob Storage, then pass a link to the bot or Power Automate flow for processing.
      • Trigger the workflow outside of chat (e.g., via a web form, email, or direct upload to storage).
    • If you must process large files in chat, consider splitting or preprocessing them before upload.
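    A minimal sketch of the routing decision in the recommendations above (the 15 MB threshold is the practical ceiling discussed in this answer, not a documented limit):

```python
# Practical chat-upload ceiling discussed in this thread; this is an
# observed, undocumented threshold, not an official limit.
CHAT_UPLOAD_LIMIT_BYTES = 15 * 1024 * 1024  # 15 MB

def choose_upload_route(size_bytes: int) -> str:
    """Return 'chat' for files small enough to upload in the chat window,
    or 'storage' for files that should go to SharePoint / OneDrive /
    Blob Storage and be handed to the flow as a link instead."""
    return "chat" if size_bytes <= CHAT_UPLOAD_LIMIT_BYTES else "storage"
```

    A 25 MB Word document would be routed to "storage" here, matching the failure mode described in the original question.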
  • MS.Ragavendar (Super User, 2026 Season 1)
    @CU27020548-0, you can refer to the quotas and limits of Copilot Studio documented by Microsoft.
  • CU27020548-0
    @MS.Ragavendar Thanks for your input; however, the link you have sent doesn't reference the limitation around a user uploading a file to the chat for querying (as opposed to using it as a knowledge source). Unless I just missed it, apologies if I did. 

    @Prasad-MSFT Thank you for your detailed answer. Would the recommended pipeline for large files then be: trigger on SharePoint / Blob / OneDrive upload -> AI Search to index the file -> AI node to query the index? Is it possible to force SharePoint to refresh its index if we just relied on uploading the file to SharePoint and the auto-indexer doing its thing?  

    On a related topic, do you know what the best approach is for using SharePoint sites with a large number of folders (400+) as a knowledge source? From the link @MS.Ragavendar provided, it seems there is a cap of 50 folders for a SharePoint knowledge source. Is the limitation the same if we have tenant Graph grounding enabled? 

    Thanks 
  • BH-05032049-0
    Hey @Prasad-MSFT thanks for the answer. However, this seems like a major architectural flaw in Copilot Studio. All of my clients are just shocked that a Studio-built agent does not accept Office documents. I can't go back to them with your explanation. It just sounds like excuses, not a sensible reason. Microsoft has released a product that works great with Office files (and conversations) in the Declarative agent. Then you release a limited product in Studio, but which has other advantages. Does Microsoft plan to converge these?
  • Prasad-MSFT (Microsoft Employee)

     Yes, the architecture you mentioned is generally the recommended approach for handling larger files. Since chat uploads have practical size limits, a common pattern is:

    • Upload the document to SharePoint / OneDrive / Azure Blob Storage

    • Trigger a Power Automate flow on upload

    • Index or process the document (e.g., using Azure AI Search)

    • Query the indexed content from the Copilot agent

    Regarding SharePoint indexing, it runs automatically through Microsoft Search and there isn't currently a way to manually force a re-index from Copilot Studio. Newly uploaded files are usually indexed after some time, though indexing latency can vary.
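    For the link-passing step of the pattern above, one common option is a Power Automate flow using the "When an HTTP request is received" trigger: the client sends only the file's URL, and the flow fetches the content server-side, so the large file never passes through the chat channel. A sketch, assuming a hypothetical trigger URL and JSON schema:

```python
import json
import urllib.request

def build_flow_payload(file_url: str, file_name: str) -> dict:
    # The flow receives only a link; it fetches the file itself (e.g. with
    # the SharePoint "Get file content" action) and runs the AI processing.
    # Field names here are illustrative, matching whatever schema you define
    # on the flow's HTTP trigger.
    return {"fileUrl": file_url, "fileName": file_name}

def notify_flow(trigger_url: str, payload: dict) -> None:
    # POST the JSON payload to the flow's HTTP trigger endpoint.
    request = urllib.request.Request(
        trigger_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)
```

    The trigger URL is generated by Power Automate when you save the flow; the SharePoint URL and field names above are placeholders.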

     

    For SharePoint knowledge sources, the documented limit is 50 folders per source. If you have a large structure (e.g., 400+ folders), you may want to:

    • Split content across multiple knowledge sources

    • Use document libraries or site-level sources

    • Consider Azure AI Search or Graph grounding for larger repositories.
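    The 50-folder cap makes the "split across multiple knowledge sources" option straightforward to script. A sketch of partitioning a flat folder list into groups of at most 50 (the cap value is the documented limit referenced earlier in this thread):

```python
MAX_FOLDERS_PER_SOURCE = 50  # documented SharePoint knowledge-source folder cap

def partition_folders(folders: list[str]) -> list[list[str]]:
    """Split a flat list of folder paths into chunks that each fit
    within a single SharePoint knowledge source."""
    return [
        folders[i : i + MAX_FOLDERS_PER_SOURCE]
        for i in range(0, len(folders), MAX_FOLDERS_PER_SOURCE)
    ]
```

    For the 400+ folders mentioned in the question, this would yield at least eight knowledge sources, which is itself a reason to prefer Graph grounding or Azure AI Search at that scale.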
