Chain-of-Thought Prompting

CoT

A prompting technique that encourages a language model to break down complex reasoning tasks into intermediate steps, improving accuracy on logical and mathematical problems.

In Plain Language

Asking an AI to "think step by step" before it gives its final answer. This often dramatically improves accuracy on math and logic problems, much like asking someone to show their work on an exam.
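The idea can be sketched in a few lines: wrap the question in a step-by-step instruction, then pull the conclusion out of the model's reasoning. The helper names and the "Final answer:" marker below are illustrative assumptions, not a standard API; a real application would send the prompt to a language model.

```python
def make_cot_prompt(question: str) -> str:
    # Zero-shot chain-of-thought: append a "think step by step" cue.
    return f"Q: {question}\nA: Let's think step by step."

def extract_final_answer(response: str, marker: str = "Final answer:") -> str:
    # Take the text after the last occurrence of the marker,
    # so intermediate reasoning is ignored. (The marker is an
    # assumed convention, prompted for elsewhere.)
    return response.rsplit(marker, 1)[-1].strip()

prompt = make_cot_prompt(
    "A train travels 60 miles in 1.5 hours. What is its average speed?"
)

# A hypothetical model response with visible intermediate steps:
response = (
    "The train covers 60 miles in 1.5 hours. "
    "Speed = distance / time = 60 / 1.5 = 40 mph. "
    "Final answer: 40 mph"
)

print(extract_final_answer(response))  # → 40 mph
```

Without the step-by-step cue, the same question would typically be answered in one shot, with no intermediate arithmetic for the model (or a reader) to check.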