Copilot Studio - General
Unanswered

LLM configuration option

Posted on 13 Jan 2025 14:22:45
How do we configure which LLM Copilot Studio uses to answer questions?
Suggested answer

    Artur Stepniak (Super User 2025 Season 1) replied on 14 Jan 2025 at 10:28:39
    Hello,
     
    Currently, it is not possible to choose which model Copilot Studio uses. However, there is an upcoming feature that will allow this; if I remember correctly, it will let you choose models from your Azure AI Foundry deployments.
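    For illustration only (this is not a Copilot Studio API; the endpoint, API key, and deployment name below are placeholders): in Azure AI Foundry, a model deployment is addressed by its deployment name, which is the kind of reference such a feature would let you select. A minimal sketch of calling a deployment with the openai Python package:

    # Minimal sketch: calling an Azure AI Foundry / Azure OpenAI model deployment directly.
    # All values below are placeholders, not real endpoints, keys, or deployment names.
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com",  # your Azure OpenAI resource endpoint
        api_key="<your-api-key>",
        api_version="2024-06-01",
    )

    # "model" is the deployment name chosen when the model was deployed in Azure AI Foundry.
    response = client.chat.completions.create(
        model="<your-deployment-name>",
        messages=[{"role": "user", "content": "Hello, which model are you?"}],
    )
    print(response.choices[0].message.content)

    Once the Copilot Studio feature ships, the expectation is that you would pick such a deployment from Copilot Studio's settings rather than call it yourself.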
     
    If you have any other questions, let me know. If this answer helped you, please mark it as the answer so that others can benefit from it.
     
    Best regards,
     
    Artur Stepniak
    Interested in GenAI? Visit my site!

