Example of Prompting Failure: Inuktitut Translation
An example illustrating the limits of prompting involves asking a Large Language Model to translate words from Inuktitut to English. If Inuktitut data was absent from the model's pre-training corpus, the model will have little understanding of the language, and no amount of prompt engineering alone will enable it to perform the translation task reliably.
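To make the scenario concrete, here is a minimal sketch of what such a zero-shot translation prompt might look like. The helper function and the sample word are illustrative assumptions, not part of any real API; the point is that even a well-formed prompt like this cannot compensate for missing pre-training data.

```python
def build_translation_prompt(word: str, src: str = "Inuktitut", tgt: str = "English") -> str:
    """Construct a zero-shot prompt asking a model for a single-word translation.

    This is a hypothetical helper for illustration only; no model is called.
    """
    return (
        f"Translate the following {src} word into {tgt}.\n"
        f"Word: {word}\n"
        f"Translation:"
    )

# Example: "nanuq" (polar bear) as a sample Inuktitut word.
prompt = build_translation_prompt("nanuq")
print(prompt)
```

However carefully this prompt is worded, a model that never saw Inuktitut during pre-training has no knowledge for the prompt to elicit, so the output will be unreliable regardless of prompt quality.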
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Computing Sciences
Foundations of Large Language Models Course
Related
A biotech startup is using a large language model, pre-trained on a general corpus of web text, to analyze and summarize highly specialized research papers on a newly discovered protein family. Despite hiring expert prompt engineers who have tried hundreds of complex, detailed prompts, the model's summaries are frequently inaccurate and miss crucial details. What is the most likely reason for this failure, and what is the most appropriate next step?
Evaluating a Claim about Prompt Engineering
Legacy Code Documentation Failure
Learn After
An advanced AI system is designed to identify plant species from photographs. It was developed and trained using a comprehensive dataset of plants found exclusively in Europe. A botanist working in the Amazon rainforest submits a clear, high-quality photograph of a newly discovered local orchid species. The system incorrectly identifies it as a common European flower that shares a similar color. The botanist wants to improve the system so it can correctly identify this new orchid in the future. Which of the following strategies is most likely to be successful?
Specialized Chatbot Failure
Obscure Programming Language Task