Few-shot Learning involves the model learning from a small number of examples (a few demonstrations) provided in the Context Window. The model sees these demonstrations of the task, infers the pattern, and then applies it to a new input, without any update to its weights.
Example
- Prompt:
- “Sea otter -> loutre de mer”
- “Peppermint -> menthe poivrée”
- “Giraffe -> girafe”
- “Cheese ->”
- Output: The model follows the demonstrated pattern and translates “cheese” as “fromage”.
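A prompt like the one above can be assembled programmatically. The short Python sketch below (the `build_few_shot_prompt` helper is a hypothetical name, not part of any library) shows one way to format demonstration pairs plus a new query into a single few-shot prompt string; the resulting string would then be sent to whatever completion API is in use.

```python
# Minimal sketch: assemble a few-shot prompt from demonstration pairs.
# The helper name and the pairs are illustrative; no model API is called here.

def build_few_shot_prompt(examples, query):
    """Format (source, target) demonstration pairs and a new query into one prompt string."""
    lines = [f"{source} -> {target}" for source, target in examples]
    lines.append(f"{query} ->")  # leave the answer blank for the model to complete
    return "\n".join(lines)

demonstrations = [
    ("Sea otter", "loutre de mer"),
    ("Peppermint", "menthe poivrée"),
    ("Giraffe", "girafe"),
]

prompt = build_few_shot_prompt(demonstrations, "Cheese")
print(prompt)
# Sea otter -> loutre de mer
# Peppermint -> menthe poivrée
# Giraffe -> girafe
# Cheese ->
```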
GPT-3 was notably described as a few-shot learner in its research paper. Ref: Brown et al. (2020), “Language Models are Few-Shot Learners”.
Related
- Zero-shot Learning: No examples provided; the model relies on the task description alone.
- One-shot Learning: Exactly one example provided.
