adversarial testing
🤖 CT-AI
Official ISTQB Definition
A test technique based on the attempted creation and execution of adversarial examples to identify defects in an ML model.
3 Ways to Think About It
The Quick Take
Deliberately trying to fool an AI model with tricky inputs to find its weaknesses.
Look Closer
Creating deceptive examples that expose where an ML model can be manipulated.
The Bottom Line
Testing how robust your AI is against inputs designed to make it fail.
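The idea can be made concrete with a tiny sketch. Below is an illustrative FGSM-style (fast gradient sign method) attack on a toy linear classifier in pure Python; the weights, input, and epsilon value are invented for demonstration, not taken from any real model or library.

```python
# Illustrative adversarial-example sketch on a toy linear classifier.
# All values here are assumptions chosen so the attack visibly flips the prediction.

def predict(w, x):
    # Classify as +1 if the score w·x is positive, else -1.
    score = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if score > 0 else -1

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def fgsm_attack(w, x, y, eps):
    # For a linear score function, the gradient of the margin y*(w·x)
    # with respect to x is y*w, so stepping each feature by
    # -eps * y * sign(w_i) pushes the input toward misclassification.
    return [xi - eps * y * sign(wi) for wi, xi in zip(w, x)]

w = [0.5, -0.25, 1.0]   # toy model weights (assumed)
x = [0.2, 0.1, 0.1]     # input the model classifies correctly as +1
y = 1                   # true label

adv = fgsm_attack(w, x, y, eps=0.3)
print(predict(w, x), predict(w, adv))  # → 1 -1
```

A small perturbation (bounded by eps per feature) is enough to flip the prediction, which is exactly the kind of weakness adversarial testing is designed to surface before deployment.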