A company develops an AI chatbot to answer employee questions about its internal HR policies, which are stored in a document database. Employees report that the chatbot sometimes provides answers based on outdated policies, even though the document database is updated daily. Which of the following is the most likely reason for this issue?
Tags
Ch.5 Inference - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Evaluating Response Grounding in a RAG System
The Role of Grounding in RAG Systems
You're on-call for an internal engineering assista...
You are reviewing two proposed designs for an inte...
Your team is building an internal "Release Notes Q...
You're designing an internal LLM assistant for a c...
Design Review: Choosing Between RAG and k-NN LM for a Regulated Support Assistant
Post-Incident Analysis: Why a RAG Assistant Hallucinated Despite "Having the Docs"
Architecture Decision Memo: Unifying Vector-DB RAG and k-NN LM for a Global Policy Assistant
Case Study: Root-Cause Analysis of "Correct Source, Wrong Answer" in a RAG + k-NN LM Assistant
Case Study: Debugging a RAG Assistant with a Vector DB and a k-NN LM Memory
Case Review: Diagnosing Conflicting Answers in a Hybrid Retrieval System