Example Prompt Instruction for Faithfulness and Abstention
This prompt instruction illustrates a technique for improving an LLM's reliability. It explicitly directs the model to prioritize factual accuracy and faithfulness to the provided context, and it gives the model a clear way to abstain: when the given information is insufficient for an accurate response, the model should output "No answer!". The instruction reads as follows:
Please note that your answers need to be as accurate as possible and faithful to the facts. If the information provided is insufficient for an accurate response, you may simply output "No answer!".
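As a minimal sketch (not from the source), the instruction above could be attached as a system message in a common chat-style request format; the message layout follows the widely used OpenAI-style schema, and the actual model call is omitted:

```python
# Sketch: prepend the faithfulness/abstention instruction as a system
# message, then ground the user turn in the retrieved context.
# The helper name `build_messages` is illustrative, not from the source.

ABSTENTION_INSTRUCTION = (
    "Please note that your answers need to be as accurate as possible "
    "and faithful to the facts. If the information provided is "
    'insufficient for an accurate response, you may simply output "No answer!".'
)

def build_messages(context: str, question: str) -> list[dict]:
    """Assemble a chat request that grounds the model in `context`."""
    return [
        {"role": "system", "content": ABSTENTION_INSTRUCTION},
        {"role": "user",
         "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]

msgs = build_messages(
    context="Ethanol has a boiling point of 78.37 °C.",
    question="What is the boiling point of ethanol?",
)
```

Keeping the instruction in the system message and the context in the user turn keeps the abstention rule active across every question in a multi-turn session.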
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Evaluating a Prompting Strategy for Factual Accuracy
A developer is building a question-answering system and provides a language model with the following instruction: 'You must base your answer strictly on the provided text. If the text does not contain enough information to answer the question accurately, you must respond with the exact phrase
Insufficient Information.' The model is then given the question 'What is the capital of Australia?' and the following text: 'Australia is a country and continent surrounded by the Indian and Pacific oceans. Its major cities are Sydney, Brisbane, Melbourne, and Perth.' Based on the developer's instruction, what is the most appropriate response from the model?
Designing a Prompt for Factual Summarization
A developer is constructing a message to send to a large language model to answer a user's question based on a retrieved document. The developer has the following three components:
- The Instruction:
You are a helpful assistant. Answer the user's question based *only* on the provided text. If the information is not in the text, state that the answer cannot be found.
- The User's Question:
What is the boiling point of ethanol?
- The Retrieved Context:
Ethanol, a volatile, flammable, colorless liquid, has a boiling point of 78.37 °C.
Which of the following arrangements of these components creates the most effective and logically structured message for the model to process and follow the instructions correctly?
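One conventional arrangement of such components, shown here as a hedged sketch, places the instruction first, the grounding context next, and the question last; the template and variable names are illustrative assumptions, not the quiz's marked answer:

```python
# Sketch: assemble instruction -> context -> question into one prompt.
# Component texts are taken from the question above; the template is
# an illustrative assumption.

INSTRUCTION = (
    "You are a helpful assistant. Answer the user's question based *only* "
    "on the provided text. If the information is not in the text, state "
    "that the answer cannot be found."
)
CONTEXT = (
    "Ethanol, a volatile, flammable, colorless liquid, has a boiling "
    "point of 78.37 °C."
)
QUESTION = "What is the boiling point of ethanol?"

prompt = f"{INSTRUCTION}\n\nText:\n{CONTEXT}\n\nQuestion: {QUESTION}"
```

Placing the question after the context keeps it adjacent to where the model begins generating, which is a common heuristic in retrieval-augmented prompting.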
Diagnosing a Faulty Q&A System Prompt
Task Definition and Grounding in RAG Prompts
Constructing a RAG Prompt
Learn After
A language model is configured with the instruction below. It is then given a user's question and a specific piece of context to generate an answer.
Instruction:
Please note that your answers need to be as accurate as possible and faithful to the facts. If the information provided is insufficient for an accurate response, you may simply output "No answer!".
User Question: "What was the primary cause of the 2008 financial crisis?"
Provided Context: "The 2008 financial crisis was a severe worldwide economic crisis. Major financial institutions collapsed, stock markets plunged, and millions lost their jobs. The crisis led to a global recession, the effects of which were felt for many years. Governments around the world implemented large-scale bailout programs and new financial regulations in response."
Based on the provided instruction and context, what is the most likely output from the language model?
Improving AI Assistant Reliability
Evaluating a Prompt Instruction for LLM Reliability