Hi community,
I've been testing generative answers (conversation boost) on a public website, and I've found the performance to be quite poor relative to Bing and Bing Copilot (using the "site:" specifier as recommended here). The site has thousands of reports, and if I ask for reports on a specific topic it'll typically give me just one. If I ask for a list, it says it doesn't understand, or that it can't provide a list and I should go search the website myself (see examples below). Bing and Bing Copilot handle the same queries well, surfacing lots of relevant reports.
I'm testing this as a copilot tool for the website, and in that setting it would obviously be awkward to suggest that the user go to the website and search there (since they're already using the copilot on the site to do exactly that). An example of a follow-up exchange that went badly:
You said: Can you list other studies?
Bot said: I’m sorry, I’m not sure how to help with that. Can you try rephrasing?
You said: You responded with one study, I'd like a list of others, please.
Bot said: I’m sorry, I’m not sure how to help with that. Can you try rephrasing?
You said: List all studies available on adolescent health in india
Bot said: I’m sorry, but I don’t have access to the specific list of studies available on adolescent health in India. However, you can visit the xyz website at [citation] to explore a wide range of research studies and resources related to adolescent reproductive and sexual health in India.
I can obviously fall back to scraping the website myself, indexing the content in a vector database, and building a custom chatbot in AI Studio (a rough sketch of that pipeline is below). But Copilot Studio seems like it's meant to meet exactly this need, and I'd prefer a solution my clients' IT teams can maintain. Any suggestions would be welcome.
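For context, here's a minimal sketch of what I mean by that fallback, assuming a hypothetical listing page, CSS selector, and chunk size (none of these come from the real site), and using Chroma as an example in-memory vector store:

```python
# Sketch of the fallback: crawl report pages, chunk the text, and index it
# in a local vector store for retrieval. URLs and selectors are placeholders.
import requests
from bs4 import BeautifulSoup
import chromadb

BASE_URL = "https://example.org/reports"  # hypothetical listing page

def fetch_report_texts(listing_url):
    """Fetch the listing page and yield (url, text) for each linked report."""
    html = requests.get(listing_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.select("a.report-link"):  # assumed selector
        report_url = link["href"]
        page_html = requests.get(report_url, timeout=30).text
        page = BeautifulSoup(page_html, "html.parser")
        yield report_url, page.get_text(separator=" ", strip=True)

def chunk(text, size=1000):
    """Naive fixed-size chunking; a real pipeline would split on structure."""
    return [text[i:i + size] for i in range(0, len(text), size)]

client = chromadb.Client()  # in-memory; Chroma embeds with its default model
collection = client.create_collection(name="reports")

for url, text in fetch_report_texts(BASE_URL):
    pieces = chunk(text)
    collection.add(
        documents=pieces,
        ids=[f"{url}#{i}" for i in range(len(pieces))],
        metadatas=[{"source": url}] * len(pieces),
    )

# Retrieve candidate passages for a user question; an LLM would then
# synthesize a list of matching studies from these results.
hits = collection.query(query_texts=["adolescent health in India"], n_results=10)
print(hits["metadatas"])
```

That's maintainable for me, but not for a client IT team that lives in the Power Platform, which is why I'd rather make Copilot Studio work.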
Thanks!
Chris