
Copilot Studio - Bot Extensibility
Answered

Retain conversation context when interacting with an Azure OpenAI ChatGPT model


I previously shared how to maintain past context when interacting with an OpenAI GPT model, by setting and updating a global variable using Bot Framework Composer, here: Set and update global variables using Bot Framework Composer and how to retain conversation context when interacting with OpenAI GPT models.

 

Today I'd like to show how to do the same, but this time using the Azure OpenAI ChatGPT model that just came out.

I find that model much better suited for conversational experiences, and it automatically formats code samples:

 

(Animated GIF: ChatGPT bot answering in Microsoft Teams, recorded 2023-03-14)
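The idea is the same as in the previous article: keep the whole conversation in a global variable and send it back to the model on every turn. With this ChatGPT (gpt-35-turbo) model the prompt uses Chat Markup Language, as in the topic code further down this thread, so the accumulated transcript ends up looking roughly like this (the user question is only an illustrative example):

    <|im_start|>system
    I am a virtual assistant that can answer questions
    <|im_end|>
    <|im_start|>user
    What is Power Virtual Agents?
    <|im_end|>
    <|im_start|>assistant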

 

A couple of useful resources on this topic:

  • minakshimathpal
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    @mboninco I have created an FAQ bot, but my bot is having a hard time maintaining context. I am saving the entities the user provides in their responses so that I can use them. I was wondering, is there a way to access/save the bot's response in a variable?

    To provide the full context of a session to the bot, the only solution I can think of is to append these back-and-forth messages to a string variable. But I'm not sure how to set/append this new variable with the text being exchanged between the bot and the user.
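    For illustration, that append could be a Set Variable step after each bot reply, following the Concatenate pattern from the verified answer below (Topic.UserQuery and Topic.BotResponse are placeholder variable names, not ones the bot provides automatically):

        - kind: SetVariable
          id: setVariable_appendTurn
          variable: Global.FullConversation
          value: =Concatenate(Global.FullConversation, "user: ", Topic.UserQuery, "\n", "bot: ", Topic.BotResponse, "\n")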

  • minakshimathpal
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    @HenryJammes How can this be done in Microsoft Copilot Studio? I want my bot to maintain the context of the conversation, but currently it does not.

  • HenryJammes
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    FYI, I've created a new version of this article using the Chat Completion API format, in both the classic and new versions of Power Virtual Agents:

    Integrate a PVA chatbot with Azure OpenAI ChatGPT using the Chat Completion API format

  • angerfire1213
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    It's now working well in our prod environment.

    Our PVA bot in Teams serves about 1,000+ users.

  • angerfire1213
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    I'm already using the same approach with Azure OpenAI GPT-4. Thank you!

  • angerfire1213
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    @HenryJammes Let me show you the difference with our OpenAI ChatGPT Turbo setup:

    (Screenshot: 2023-03-31 at 10.22.33 AM)

     

     

  • angerfire1213
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    Thank you! Hope you get Azure OpenAI GPT-4 soon...

  • HenryJammes
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    Hi @angerfire1213, the example above was created with the GPT-3.5 Turbo model using Chat Markup Language, and it works.

    I haven't tried the Chat Completion format yet, and GPT-4 might require it. The above approach would indeed need reworking to meet the new JSON format requirements, but the overall approach should be the same.

    I'll try to update or create a new article once I get access to GPT-4.
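    For reference, the reworked version would send the conversation as a JSON messages array instead of one concatenated string with <|im_start|> / <|im_end|> markers, roughly like this (the role/content values are illustrative only):

        {
          "messages": [
            {"role": "system", "content": "I am a virtual assistant that can answer questions"},
            {"role": "user", "content": "What is Power Virtual Agents?"},
            {"role": "assistant", "content": "Power Virtual Agents lets you build chatbots without writing code."},
            {"role": "user", "content": "How do I publish my bot to Teams?"}
          ]
        }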

  • angerfire1213
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    How would you do this for GPT-35-Turbo (preview) & GPT-4 (preview)?

    GPT-35-Turbo (preview) & GPT-4 (preview) use messages, not prompt:

    messages = [
        {"role": "system", "content": "Assistant is a large language model trained by OpenAI."},
        {"role": "user", "content": "What's the difference between garbanzo beans and chickpeas?"},
    ]

     

    Learn more about GPT-35-Turbo (preview) & GPT-4 (preview):

    https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/chatgpt?pivots=programming-language-chat-completions#working-with-the-chatgpt-and-gpt-4-models-preview

     

  • Verified answer
    MattJimison
    Re: Retain conversation context when interacting with an Azure OpenAI ChatGPT model

    This is awesome, @HenryJammes !

     

    I just went through a very similar exercise, but using the Unified Canvas instead, so I thought I'd share the code view of my Fallback topic, since copying/pasting a topic is a breeze in the Unified Canvas! 😀

     

    A few notes to get it set up:

    1. Disable or remove any unnecessary existing topics to avoid triggering them, if your intent is to handle all requests via ChatGPT.
    2. The Power Automate flow is the same as the example shown above with Bot Framework Composer, except I'm not passing the user's query into the flow and back out, since I'm just concatenating with that value in the same topic.
    3. You'll need to get the ID of your flow in your environment and paste it into the part below where I've entered an empty GUID (all 0's). You could also manually recreate this part in the UI if desired.
    4. Just open your Fallback topic, go to code view, replace the flow ID as noted above, select all the text, delete it, and paste in the code below.

     

     

     

    kind: AdaptiveDialog
    beginDialog:
      kind: OnUnknownIntent
      id: main
      actions:
        - kind: SetVariable
          id: setVariable_Kyy1Yh
          variable: Topic.UserQuery
          value: =System.Activity.Text

        - kind: ConditionGroup
          id: conditionGroup_Nuj40I
          conditions:
            - id: conditionItem_iRCr9Y
              condition: =IsBlank(Global.FullConversation)
              actions:
                - kind: SetVariable
                  id: 36cfgS
                  variable: Global.FullConversation
                  value: <|im_start|>system\nI am a virtual assistant that can answer questions\n<|im_end|>\n<|im_start|>user\n

        - kind: InvokeFlowAction
          id: invokeFlowAction_Ho6dKr
          input:
            binding:
              text: =Topic.UserQuery
              text_1: =Global.FullConversation
          output:
            binding:
              response: Topic.Response
          flowId: 00000000-0000-0000-0000-000000000000

        - kind: SendMessage
          id: sendMessage_UgHog9
          message: "{Topic.Response}"

        - kind: SetVariable
          id: setVariable_0LLSEn
          variable: Global.FullConversation
          value: =Concatenate(Global.FullConversation, Topic.UserQuery, "\n<|im_end|>\n<|im_start|>assistant", Topic.Response, "\n<|im_end|>\n<|im_start|>user\n")
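    The Power Automate flow itself isn't shown here (it follows the earlier articles), but for context, its call to Azure OpenAI would be a Completions request along these lines, combining the two inputs (text_1, the running conversation, followed by text, the latest user query) into the prompt and stopping on the <|im_end|> marker. The endpoint, deployment, api-version, and parameter values below are placeholders rather than values from this thread:

        POST https://{your-resource}.openai.azure.com/openai/deployments/{your-deployment}/completions?api-version={api-version}
        api-key: {your-key}
        Content-Type: application/json

        {
          "prompt": "<Global.FullConversation followed by the latest Topic.UserQuery>",
          "max_tokens": 400,
          "temperature": 0.7,
          "stop": ["<|im_end|>"]
        }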

     

     

     
