
Vector search in Copilot Studio

Posted by kirstvh

Hi all,

 

Is it possible to use vector search in Copilot Studio? I have a knowledge base in multiple languages, so vector search would be my preferred search method.

 

However, when I configure "Azure OpenAI on your data" in Azure OpenAI Studio with vector search enabled, I cannot deploy to Copilot Studio; I have to disable vector search first.

 

Is anyone encountering the same problem? Do you have a workaround?


Thank you so much for your help! 🙂


@HenryJammes 

 

Kind regards, 

Kirsten

  • remidyon

    Hello @kirstvh 

    Vector search is not yet supported by Copilot Studio when extending with Azure OpenAI.

    Stay tuned for our Microsoft Build event (21-23 May) for more details on the future roadmap of this feature.

  • YassineI

    Hey @kirstvh!

     

    There is a workaround for that 😊

     

    There is a built-in feature to perform HTTP calls directly from Copilot Studio. This powerful feature also lets you parse the output of your call directly, so you can use it to run RAG against your own index. Alternatively, you can use Power Automate for this processing. You then send the results from the search index to a generative answers node. (A rough code sketch of steps 2-4 follows the list below.)

     

    1. Append a variable with the user's message (conversationHistory).

    2. Make an HTTP call to your GPT deployment to turn the user's message into a search query (you simply instruct your GPT model to convert the user's message into a short search query). Refer to Azure OpenAI Service REST API reference - Azure OpenAI | Microsoft Learn for more details.

    3. Make an HTTP call to your embedding model to convert the search query into a vector. The guide on How to generate embeddings with Azure OpenAI Service - Azure OpenAI | Microsoft Learn might be useful.

    4. Make an HTTP call to your index (hybrid search). Hybrid query - Azure AI Search | Microsoft Learn.

    5. Follow this tutorial to format the data for your generative answers node (Use a custom data source for generative answers - Microsoft Copilot Studio | Microsoft Learn). You will need some PowerFx (setVariable node) to parse the output of the hybrid search and add the conversationHistory. (GPT models are very good at generating PowerFx 😁)

    6. Append your ConversationHistory variable with the generated answer.
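
    To make the middle steps concrete, here is a rough Python sketch of steps 2-4 (inside Copilot Studio you would make the same calls with HTTP request nodes or a Power Automate flow rather than Python). The resource names, deployment names, API versions and index field names are placeholders I picked for the example, not anything mandated by Copilot Studio, so adjust them to your own setup.

```python
# Rough sketch of steps 2-4 in Python (in Copilot Studio these would be HTTP
# request nodes or a Power Automate flow). All names below are placeholders.
import requests

AOAI_ENDPOINT = "https://<your-openai-resource>.openai.azure.com"
AOAI_KEY = "<azure-openai-api-key>"
SEARCH_ENDPOINT = "https://<your-search-service>.search.windows.net"
SEARCH_KEY = "<azure-ai-search-api-key>"
AOAI_API_VERSION = "2024-02-01"      # adjust to the API version you target
SEARCH_API_VERSION = "2023-11-01"    # vector/hybrid queries need 2023-11-01 or later


def rewrite_query(conversation_history: str, user_message: str) -> str:
    """Step 2: ask a GPT deployment to turn the latest message into a short search query."""
    url = (f"{AOAI_ENDPOINT}/openai/deployments/<gpt-deployment>/chat/completions"
           f"?api-version={AOAI_API_VERSION}")
    body = {
        "messages": [
            {"role": "system",
             "content": "Rewrite the user's last message as a short, self-contained "
                        "search query. Use the conversation history to resolve references."},
            {"role": "user",
             "content": f"History:\n{conversation_history}\n\nLast message: {user_message}"},
        ],
        "temperature": 0,
    }
    resp = requests.post(url, headers={"api-key": AOAI_KEY}, json=body)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"].strip()


def embed(text: str) -> list[float]:
    """Step 3: turn the search query into a vector with an embedding deployment."""
    url = (f"{AOAI_ENDPOINT}/openai/deployments/<embedding-deployment>/embeddings"
           f"?api-version={AOAI_API_VERSION}")
    resp = requests.post(url, headers={"api-key": AOAI_KEY}, json={"input": text})
    resp.raise_for_status()
    return resp.json()["data"][0]["embedding"]


def hybrid_search(query: str, vector: list[float], k: int = 5) -> list[dict]:
    """Step 4: hybrid (keyword + vector) query against the Azure AI Search index."""
    url = (f"{SEARCH_ENDPOINT}/indexes/<your-index>/docs/search"
           f"?api-version={SEARCH_API_VERSION}")
    body = {
        "search": query,                 # keyword part
        "vectorQueries": [{              # vector part
            "kind": "vector",
            "vector": vector,
            "fields": "contentVector",   # assumed name of the vector field in your index
            "k": k,
        }],
        "select": "title,content",       # assumed fields holding your chunks
        "top": k,
    }
    resp = requests.post(url, headers={"api-key": SEARCH_KEY}, json=body)
    resp.raise_for_status()
    return resp.json()["value"]          # list of matching chunks to feed into step 5
```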

    It's a bit tricky, but once set up it works smoothly and the results are noticeably better.

    I hope it helps!

     

    Yassine

  • kirstvh

    @YassineI Thank you for your insightful answer! This really helps a lot, I will try to implement your proposal. 😊

     

    Could you elaborate a bit on why we need the ConversationHistory variable and why we need to append it with the generated answer? Isn't this something that happens automatically? Thanks!

  • YassineI

    Great point! In fact, I did it because I was trying to replace the generative answers node with a GPT model.

    I think you're right: if you use the generative answers node, Copilot Studio will do that work for you.

  • kirstvh

    @YassineI Okay, then we are completely aligned! Thank you so much again and have a great day! 😊

  • YassineI

    @kirstvh Now I remember why I did that!! 😁

     

    The goal was to provide the conversation history to the first step of your RAG, so the user's request can be turned into a self-contained search query (step 2 in my previous message). Without that context, some questions cannot be answered properly. (A small sketch follows the example below.)

     

    Here's an example:

    • Q1: What is Power Automate?
    • R1: It's a thing in Power Platform etc.
    • Q2: Is it part of Fabric?
    • R2: The result will not be relevant because it lacks context. The model doesn't know the previous question was about Power Automate, so the AI Search index will not return relevant chunks and the answer will be poor.
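
    As a small illustration, here is what the rewrite step (step 2) would receive in this example, reusing the hypothetical rewrite_query sketch from my earlier reply. The inputs and the rewritten query in the comments are made up to match the example above, not guaranteed model output.

```python
# Hypothetical inputs matching the Q1/Q2 example above.
history = (
    "User: What is Power Automate?\n"
    "Bot: It's the workflow automation service in Power Platform."
)
follow_up = "Is it part of Fabric?"

# Without the history, the rewriter only sees "Is it part of Fabric?" and the
# search index has no way to know what "it" refers to, so the chunks it returns
# are irrelevant. With the history included, the rewriter can produce a
# self-contained query such as "Is Power Automate part of Microsoft Fabric?",
# which retrieves useful chunks.
search_query = rewrite_query(history, follow_up)
```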

    Voilà voilà! Hope it helps!

  • CU23081425-0
    Is vector search using an Azure OpenAI classic data connection still not supported directly in the Generative Answers node? If it is still not supported, I am curious why there are settings that imply otherwise.
