Is there a workaround to connect a SharePoint site to my Copilot Studio bot, even if the site URL contains query strings and is more than two levels deep? This goes against the rules of not including query strings, more than two levels of depth, or the character "." in the URL.

I am concerned that connecting to the top-level SharePoint site will result in a knowledge base that is too large for the bot, decreasing the accuracy of its responses, which is why I would like to connect only a specific "sub-area" within SharePoint.
You can point the Gen Answers node at a specific document library. For example:
```yaml
kind: AdaptiveDialog
beginDialog:
  kind: OnUnknownIntent
  id: main
  actions:
    - kind: SearchAndSummarizeContent
      id: searchAndSummarizeContent_QakXNj
      userInput: =System.Activity.Text
      moderationLevel: Low
      additionalInstructions:
      publicDataSource: {}
      sharePointSearchDataSource:
        sites:
          - "https://mydomain.sharepoint.com/sites/EUHR/Invoices"
      customDataSource: {}
```
Thanks for the tip, but unfortunately this doesn't seem to work, on the basis that the link to the specific document library is more than two levels deep and contains the character ".". When I add the SharePoint site in the Gen Answers node under "Data source" as the only data source, it will not provide any answers, even for a simple question that is tested to work when I use the main SharePoint site instead. Though there's always the chance that I have made a mistake and/or misunderstood your answer.
Can you share the SharePoint URL you are using with Gen Answers?
The URL below is a changed version of the real URL; adjustments have been made to obscure potentially sensitive info. The URL I want to use as my bot's knowledge base is similar to:

companyname.sharepoint.com/teams/Firmname/Shared%20Documents/Forms/AllItems.aspx?FolderCTID=0x01202035D46F4DD00F1B1294A22D1A0ABa8623&id=%2Fteams%2FDatabasename%2FShared%20Documents%2FDB%20review%20blue%20C5_4%2FFB%20V09%2020%2E12%2E23%2Fdatabase%20G%20firm%26s%20documents%202023&viewid=fe50fa13-a6fb-4756-b274-b1f748da739b

The main SharePoint site that I used as a test, and which works, looks like:

companyname.sharepoint.com/teams/Firmname/

However, this may result in an excessively large knowledge base for the bot to handle.
I have the same issue: we have a deep SharePoint structure, with many of our libraries created in Teams. I can't use Copilot for Microsoft 365 or Copilot Studio to reference these libraries, as they only support URLs two levels deep. Has anyone found a workaround?
There is no workaround that I know of, so much as a custom pattern: build a Power Automate flow that queries the Graph API and feeds the results to the "custom data" input of the Gen Answers node.
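As a rough sketch of that pattern (the tenant URL, library path, and token acquisition are placeholders, and the flow itself would normally be an HTTP action in Power Automate rather than Python), the idea is to POST to the Microsoft Graph search endpoint with a query scoped to the deep library, then hand the top hits to Gen Answers as custom data:

```python
import json
import urllib.request

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"

def build_search_payload(query_text: str, library_path: str, top: int = 5) -> dict:
    """Build a Graph search request that scopes driveItem results to a
    single document library via a KQL path: restriction."""
    return {
        "requests": [
            {
                "entityTypes": ["driveItem"],
                "query": {
                    # Combine the user's question with a path restriction so
                    # only files under the deep library are returned.
                    "queryString": f'{query_text} path:"{library_path}"'
                },
                "from": 0,
                "size": top,  # only the top N hits get summarized anyway
            }
        ]
    }

def search_library(token: str, query_text: str, library_path: str) -> dict:
    """POST the request to Graph search. Requires an access token with
    an appropriate permission such as Sites.Read.All (obtained elsewhere)."""
    payload = json.dumps(build_search_payload(query_text, library_path)).encode()
    req = urllib.request.Request(
        GRAPH_SEARCH_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In the flow, you would extract the hit summaries (or fetch the file contents) from the response and pass them into the Gen Answers node's custom data input, bypassing the two-levels-deep URL restriction entirely.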
This video is a good place to start.
This does seem like a good workaround when referencing a web page that is a single article, but I'm hesitant that it would work on a much larger library of files and folders. Even if it did work, I'm guessing it would need to search through everything for each response, which would take a long time, and the bot would not be "trained" on the documents; rather, it would be searching and extracting for each user question. In my case, I don't think this solution would be viable.
Not sure I follow: when you implement a RAG pattern, only the top N results are fed to the LLM for summarization. This is also how Gen Answers works out of the box.
Yes, and with a large library there is a chance that some of the top N results are not the exact ones I want, which is why I want to use a smaller library (which also contains multiple documents and folders, though not as many as the main library). But the example in the video only used a web page consisting of a single article, which is why I think that solution might not be viable in my case. Though I might be wrong, or misunderstanding some key aspect.
That might be true, but it's also true of almost any RAG pattern, not just the workaround I suggested. If you build your own query to the Graph API, you could add filters based on your use case to reduce the population of relevant results.
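To illustrate what such filtering could look like (the filter values here are hypothetical, and KQL support varies by property), the query string sent to Graph search can be narrowed by path, file type, and modification date before any results reach the summarization step:

```python
from typing import Optional

def build_kql(question: str,
              path: Optional[str] = None,
              filetype: Optional[str] = None,
              modified_after: Optional[str] = None) -> str:
    """Compose a KQL query string for SharePoint/Graph search from
    optional filters to shrink the candidate result set."""
    parts = [question]
    if path:
        parts.append(f'path:"{path}"')        # restrict hits to one library/folder
    if filetype:
        parts.append(f"filetype:{filetype}")  # e.g. pdf, docx
    if modified_after:
        parts.append(f"LastModifiedTime>={modified_after}")  # e.g. 2024-01-01
    # In KQL, space-separated property restrictions act as an implicit AND.
    return " ".join(parts)
```

The narrower the pre-filter, the more likely the top N results that reach the LLM are the documents you actually want summarized.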