context window
🧠 CT-GenAI
Official ISTQB Definition
The amount of preceding text, measured in tokens, that an LLM can consider when generating responses.
3 Ways to Think About It
The Quick Take
The amount of text an LLM can 'remember' in a conversation - a key constraint when testing chatbots.
Look Closer
The AI's working memory limit - in long conversations the model may lose earlier context, affecting multi-turn test scenarios.
The Bottom Line
How much information you can include in a prompt, critical for designing comprehensive test inputs.
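The definitions above can be made concrete with a small sketch of how chat history is trimmed to fit a context window. This is an illustrative toy, not any vendor's API: the whitespace tokenizer and the `fit_to_context` helper are assumptions for demonstration, since real LLMs count tokens with subword tokenizers (e.g. BPE), so the numbers here are only approximate.

```python
def count_tokens(text: str) -> int:
    """Naive token count: whitespace-separated words.
    Real tokenizers (BPE etc.) produce different, usually higher, counts."""
    return len(text.split())

def fit_to_context(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages whose combined token count fits
    within max_tokens; older messages are dropped first - which is why
    an LLM can 'forget' the start of a long conversation."""
    kept: list[str] = []
    budget = max_tokens
    for msg in reversed(messages):   # walk from newest to oldest
        cost = count_tokens(msg)
        if cost > budget:
            break                    # this and all older messages are cut
        kept.append(msg)
        budget -= cost
    return list(reversed(kept))      # restore chronological order

history = ["hello there", "how are you today", "tell me a joke please"]
print(fit_to_context(history, max_tokens=8))
```

With a budget of 8 toy tokens, only the last message (5 tokens) survives; adding the 4-token message before it would exceed the budget, so everything earlier is dropped. Testers can use the same idea to design scenarios that deliberately push key instructions outside the window.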