It is in the nature of LLMs that you can receive a different response even when using the same prompt. In this case, the likely cause is that additional metaprompts are added to the system prompt when the bot is deployed to a Teams channel. The only way to overcome this is to prepare the bot so that it behaves consistently in both channels. Test, test and... test. :-) We also need to be patient, as this is a new technology and a lot is changing behind the scenes.
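To illustrate why the same prompt can produce different answers: LLMs sample the next token from a probability distribution, so any temperature above zero introduces randomness. Below is a minimal, purely illustrative Python sketch of temperature-based sampling (this is a generic model of the mechanism, not Copilot Studio or Teams internals; the logit values are made up):

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Pick a token index from raw logits.

    temperature == 0 means greedy decoding (always the argmax);
    higher temperatures flatten the distribution and add variability.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax with temperature scaling (numerically stabilized).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

# Hypothetical next-token scores for one fixed prompt.
logits = [2.0, 1.5, 0.5]
rng = random.Random(42)

# At temperature 1.0, repeated calls on the same logits vary.
print([sample_token(logits, 1.0, rng) for _ in range(10)])

# At temperature 0, the choice is always the highest-scoring token.
print(sample_token(logits, 0, rng))
```

The practical takeaway: even with identical prompts, any non-zero sampling temperature (plus whatever metaprompts a channel injects) can change the output, which is why testing in the actual target channel matters.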
If you have any other questions, let me know. If this answer helped you, please mark it as accepted so that others can benefit from it.
Best regards,
Artur Stepniak