Hi all
Why does Copilot give a different answer every time?
It has the right information for the answer. It even has a topic containing the exact phrase, but it only offered it as a "what did you mean" suggestion once, and one of the answers was even "I don't know", although it knows the answer and sometimes gives it correctly.
Thank you!
Good morning,
That is unfortunately how generative AIs work right now. Try a little experiment: go to ChatGPT and open two chat windows. Ask the exact same question in each, and you will get different answers.
In short: when you ask a generative AI a question, it doesn't "understand" the question but tries to find the most statistically plausible answer given the context. It's a bit like building a sentence word by word, choosing the most plausible-sounding word each time.
To make the chat feel more "natural", and to let the chatbot reformulate answers when asked, you typically let the AI make some choices at random every so often (it doesn't always pick the most plausible word, but sometimes the second most plausible, and so on).
In the GPT models, two parameters (temperature and top_p) let you control the randomness of the answers. Unfortunately, you can't adjust them in ready-made products like ChatGPT or Copilot.
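To give an intuition for what those two parameters do, here is a small Python sketch. The token probabilities are made up for illustration, but the mechanics match how sampling generally works: temperature rescales the distribution (low = more deterministic, high = more random), and top_p keeps only the most likely tokens whose cumulative probability reaches the threshold before sampling.

```python
import math
import random

def apply_temperature(probs, temperature):
    # Rescale log-probabilities by 1/temperature, then renormalize.
    # Low temperature sharpens the distribution; high flattens it.
    scaled = [math.log(p) / temperature for p in probs.values()]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return {tok: e / total for tok, e in zip(probs, exps)}

def top_p_filter(probs, top_p):
    # Keep the smallest set of most-likely tokens whose cumulative
    # probability reaches top_p, then renormalize what remains.
    kept, cum = {}, 0.0
    for tok, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        kept[tok] = p
        cum += p
        if cum >= top_p:
            break
    total = sum(kept.values())
    return {tok: p / total for tok, p in kept.items()}

# Hypothetical next-word distribution after "The sky is"
probs = {"blue": 0.60, "clear": 0.25, "falling": 0.10, "green": 0.05}

sharp = apply_temperature(probs, temperature=0.5)  # favors "blue" even more
flat = apply_temperature(probs, temperature=2.0)   # spreads the probability out
nucleus = top_p_filter(probs, top_p=0.9)           # drops the unlikely tail

# Finally, sample one token at random from the filtered distribution --
# this sampling step is why two identical prompts can diverge.
token = random.choices(list(nucleus), weights=list(nucleus.values()))[0]
```

With temperature 0.5 the model almost always says "blue"; with temperature 2.0 the odd words get a real chance, which is exactly the variability you are seeing between runs.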