Context Window

The Context Window (or simply Context, also referred to as Context Size or Context Length) is the maximum number of tokens an LLM takes into account when predicting the next token. It allows the model to “look back” at previous text so that earlier tokens influence the current prediction.
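
As a rough illustration, the sketch below (plain Python; the `context_size` value and token IDs are made up for the example, not taken from any particular model or library) shows how a fixed Context Size limits which tokens the model actually sees:

```python
# Minimal sketch: a fixed context window limits how far back the model can "see".
# context_size and the token IDs below are illustration values only.
context_size = 4

token_ids = [101, 2009, 2003, 1037, 2204, 2154, 102]  # a tokenized sentence (hypothetical IDs)

# Only the last `context_size` tokens are passed to the model when predicting
# the next token; anything earlier falls outside the context window.
model_input = token_ids[-context_size:]
print(model_input)  # [2003, 1037, 2204, 2154]
```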

Key Concepts

Implementation Details

When creating Input-Target Pairs for training, the Context Size determines how many tokens are included in each input sample; the corresponding target is the same sequence shifted one token to the right, as shown in the sketch below.
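
As a concrete sketch (the token IDs and `context_size` here are placeholder values, not a real dataset), the snippet below builds Input-Target Pairs with a sliding window over a tokenized text, where each input contains exactly `context_size` tokens and each target is that input shifted by one token:

```python
# Sketch: building Input-Target Pairs with a sliding window over token IDs.
context_size = 4
token_ids = list(range(1, 11))  # pretend token IDs: [1, 2, ..., 10]

pairs = []
for i in range(len(token_ids) - context_size):
    input_chunk = token_ids[i : i + context_size]            # what the model sees
    target_chunk = token_ids[i + 1 : i + context_size + 1]   # next-token labels (shifted by one)
    pairs.append((input_chunk, target_chunk))

for x, y in pairs[:2]:
    print(x, "->", y)
# [1, 2, 3, 4] -> [2, 3, 4, 5]
# [2, 3, 4, 5] -> [3, 4, 5, 6]
```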
