Example

Application of CoT to Dyck Languages

Dyck languages consist of well-nested sequences of brackets or parentheses; a common benchmark task is to complete a partial sequence so that the result is syntactically valid. Chain-of-Thought (CoT) prompting helps language models on such symbolic reasoning tasks by encouraging step-by-step state tracking. For example, given the input sequence [ {, the model uses CoT to explicitly track the stack's state after each symbol (e.g., 0: empty stack, 1: [ ; stack: [, 2: { ; stack: [ {) and then systematically determine that the correct closing sequence is } ].
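The state tracking that CoT makes explicit can be sketched in code. The snippet below is a minimal illustration (not from the source): it maintains the same stack the CoT trace describes, prints each intermediate state in the style of the example above, and derives the closing sequence by emitting matches for the open brackets in reverse (LIFO) order. The bracket pairs and function name are illustrative assumptions.

```python
# Illustrative sketch of the stack tracking that CoT makes explicit.
PAIRS = {"[": "]", "{": "}", "(": ")"}  # assumed bracket pairs

def complete_dyck(prefix: str) -> str:
    """Return the closing sequence that makes `prefix` a valid Dyck word."""
    stack = []
    print("0: empty stack")
    for step, symbol in enumerate(prefix.replace(" ", ""), start=1):
        if symbol in PAIRS:
            stack.append(symbol)            # open bracket: push
        elif stack and PAIRS[stack[-1]] == symbol:
            stack.pop()                     # matching close: pop
        else:
            raise ValueError(f"invalid symbol at step {step}: {symbol!r}")
        print(f"{step}: {symbol} ; stack: {' '.join(stack) or 'empty'}")
    # Close the remaining open brackets in reverse (LIFO) order.
    return " ".join(PAIRS[s] for s in reversed(stack))
```

Calling complete_dyck("[ {") prints the same trace as the CoT example and returns "} ]".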

Updated 2026-04-30


Ch.3 Prompting - Foundations of Large Language Models