Context Window

Definition

The maximum amount of text, measured in tokens, that an LLM can process in a single call, including both the input (system prompt, conversation history, tool definitions) and the generated output. Larger context windows let agents reason over more information at once, but they cost more per call and model quality can degrade near the limits.

Examples in the Wild

  • Example 1: Claude Sonnet 4: 200k tokens (~150k words)
  • Example 2: GPT-4o: 128k tokens
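Because input and output must share the same window, a practical concern is budgeting: checking whether a prompt plus a reserved output allowance fits. The sketch below illustrates the idea using a crude character-based token estimate; the `estimate_tokens` heuristic and the 4-characters-per-token ratio are assumptions for illustration, and a real application would use the model's actual tokenizer.

```python
# Rough context-window budget check. Token counts here are an
# approximation (~4 characters per token for English text); a real
# tokenizer should be used in practice.

CONTEXT_WINDOW = 200_000  # e.g. Claude Sonnet 4, per the list above


def estimate_tokens(text: str) -> int:
    """Crude heuristic: roughly 4 characters per token."""
    return max(1, len(text) // 4)


def fits_in_window(system_prompt: str, history: list[str],
                   max_output_tokens: int = 4_096,
                   window: int = CONTEXT_WINDOW) -> bool:
    """Input (system prompt + history) plus the reserved output
    allowance must together fit inside the model's context window."""
    input_tokens = estimate_tokens(system_prompt) + sum(
        estimate_tokens(m) for m in history
    )
    return input_tokens + max_output_tokens <= window


# A short conversation easily fits a 200k-token window.
print(fits_in_window("You are a helpful assistant.",
                     ["Hello!", "Hi, how can I help?"]))
```

The same check also shows why window size matters for agents: long tool outputs and accumulated history consume input budget, leaving less room for the model's response.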