
tokenization

🧠 CT-GenAI

Official ISTQB Definition

The process of breaking down text into smaller units called tokens for efficient processing by language models.

3 Ways to Think About It

💡
The Quick Take

How LLMs break text into chunks for processing; this affects costs, context limits, and test prompt design.

🔍
Look Closer

The process of splitting your prompts into pieces the AI can understand, impacting how much you can ask.

🎯
The Bottom Line

Understanding tokens helps testers optimize prompts and estimate API costs when testing LLM features.
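A quick way to put this into practice: before sending test prompts, estimate how many tokens they use and what the API call will cost. Production models use subword tokenizers (e.g. BPE, as in the tiktoken library); the sketch below instead uses the common rule of thumb of roughly 4 characters per token for English text, and the price constant is a hypothetical placeholder, not a real provider rate.

```python
# Rough token-count and cost estimate for prompt budgeting.
# Assumption: ~4 characters per token is a heuristic average for
# English text, NOT an exact tokenizer count.
CHARS_PER_TOKEN = 4
PRICE_PER_1K_TOKENS = 0.01   # hypothetical rate; check your provider

def estimate_tokens(text: str) -> int:
    """Approximate the number of tokens in a prompt."""
    return max(1, round(len(text) / CHARS_PER_TOKEN))

def estimate_cost(text: str) -> float:
    """Approximate the API cost of sending `text` as a prompt."""
    return estimate_tokens(text) / 1000 * PRICE_PER_1K_TOKENS

prompt = "Summarize the defect report below in three bullet points."
print(estimate_tokens(prompt))   # rough token count for this prompt
```

For real budgeting, swap the heuristic for the actual tokenizer of the model under test, since subword splits vary by model and language.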
