We have a model and grounding context set up in Azure AI Foundry and want to use them from Copilot Studio, but I cannot get the restriction to work. The goal is a GPT-powered bot that only answers questions covered by the context we defined in Azure AI Foundry; right now the chatbot I built in Copilot Studio answers every question we ask instead of staying within that context.

When I enable Copilot Studio's built-in Knowledge, it still responds to everything. I also tried building a custom connector that calls OpenAI through an HTTP request from Power Apps, but that did not work either. As a next step I am trying to integrate the Azure OpenAI language models directly, and that is not working yet.

How can I integrate Copilot Studio with Azure OpenAI and restrict the bot so it only answers from the context defined in Azure AI Foundry?
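For reference, this is roughly the call I am trying to reproduce in the custom connector. It is only a sketch against the Azure OpenAI chat completions REST API; the resource name, deployment name, API key, and API version are placeholders from my own setup (not something anyone else can reuse as-is), and the system message is simply my attempt to keep answers inside the context I wrote.

```python
import requests

# Placeholders from my Azure OpenAI resource (assumptions, replace with your own)
ENDPOINT = "https://<your-resource>.openai.azure.com"
DEPLOYMENT = "<your-gpt-deployment>"
API_VERSION = "2024-02-01"  # the API version I have been testing with
API_KEY = "<your-api-key>"

# The grounding text the bot should be limited to (copied from my Azure AI Foundry context)
CONTEXT = "<the context the bot is allowed to answer from>"


def ask(question: str) -> str:
    """Send one chat-completions request, constrained to the given context."""
    url = (
        f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
        f"/chat/completions?api-version={API_VERSION}"
    )
    body = {
        "messages": [
            {
                "role": "system",
                "content": (
                    "Answer ONLY using the context below. If the answer is not "
                    "in the context, say you cannot help with that.\n\n"
                    f"Context:\n{CONTEXT}"
                ),
            },
            {"role": "user", "content": question},
        ],
        "temperature": 0,
    }
    resp = requests.post(url, headers={"api-key": API_KEY}, json=body, timeout=30)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask("A question that should only be answered from the context"))
```

This works when I call the endpoint directly, so what I am really missing is how to get the same restricted behavior from inside Copilot Studio.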