2 Ocelot

Action items

Identify which questions should be removed from the Registrar library.
Identify which Registrar library responses need to be updated.
Identify questions and develop responses to questions not in the Registrar library.
Plan content development management with Steph D.
Add questions related to consortium and enrollment confirmation.

2024-07-29

Colette to email Catie at Ocelot to better understand the hierarchy guiding the source of answers (in the library vs. on our website vs. AI). This will inform our next step.

Colette will ask Shawn to give access to the chatbot to our interns so they can identify holes in content.

We are aiming for a launch in the second week of October. This timing was based on spring/summer registration.

2024-07-31

From Katie at Ocelot

When a student asks a question, our virtual assistant follows these steps to prioritize its response:

  1. Primary Knowledge Base Answer:

    • The assistant first searches its knowledge base for a primary answer.

    • If found, this answer is given to the student and the interaction is logged as a “Knowledge base” response.

  2. Auto Content Generation (ACG):

    • If no primary answer is found and ACG is enabled, the assistant attempts to generate an answer.

    • If ACG produces an answer, it is provided to the student and logged as “Generative AI”.

    • If ACG is blocked by guardrails, a default respectful message is provided and logged as “Declined”.

  3. Suggestions:

    • If neither a primary answer nor an ACG answer is available, the assistant searches for suggestions.

    • These can include knowledge base suggestions, web links, and videos.

    • If suggestions are found, they are provided and the interaction is logged based on the type of suggestions (either “Search” or “Suggestion”).

  4. Fallback:

    • If all methods fail to provide an answer or suggestions, the assistant may respond with an “I don’t know” message.
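The four-step priority order above can be sketched as a simple decision chain. This is an illustrative reconstruction only, not Ocelot's actual implementation; the function and parameter names (`answer`, `kb_lookup`, `acg_generate`, `find_suggestions`) are hypothetical, and the suggestion step collapses the "Search"/"Suggestion" distinction into one label.

```python
# Hypothetical sketch of the response-prioritization flow described above.
# All names here are illustrative stand-ins, not Ocelot's API.

def answer(question, kb_lookup, acg_generate, find_suggestions, acg_enabled=True):
    """Return (reply, log_label) following the priority order described above."""
    # 1. Primary knowledge base answer
    primary = kb_lookup(question)
    if primary is not None:
        return primary, "Knowledge base"

    # 2. Auto Content Generation (ACG), if enabled
    if acg_enabled:
        generated, blocked = acg_generate(question)
        if blocked:
            # Guardrails blocked the generation: default respectful message
            return "I'm sorry, I can't help with that request.", "Declined"
        if generated is not None:
            return generated, "Generative AI"

    # 3. Suggestions (knowledge base articles, web links, videos)
    suggestions = find_suggestions(question)
    if suggestions:
        return suggestions, "Suggestion"

    # 4. Fallback
    return "I don't know.", "Fallback"
```

Passing simple stub callables for each lookup step makes it easy to see which branch fires; for example, a `kb_lookup` stub that returns an answer short-circuits the chain at step 1 and logs "Knowledge base".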

The Generative AI works by using natural language processing to understand the intent of the question, then gathering relevant information from your spidered data. It scans all of that data and returns what the AI believes to be the most relevant response. If the student asks the same question again afterward, a new response is created, because repeating the question essentially tells the AI to generate a new answer. Also wanted to mention that we will soon have the ability to rank the order of spiders that the AI pulls from.

In terms of testing, if it is easier to log questions in a spreadsheet, you definitely could. Or it might be easier to save them as drafts and review them later (they will appear here). Usually clients have a few content experts from each department who focus on testing that department's questions.