Erode ML Model Integrity

Type: technique

Description: Adversaries may degrade the target model's performance with adversarial data inputs to erode confidence in the system over time. This can lead to the victim organization wasting time and money both attempting to fix the system and performing by hand the tasks the system was meant to automate.
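
As a rough illustration of the idea, the sketch below trains a toy logistic-regression "victim" model on synthetic data and then applies an FGSM-style perturbation to its inputs so that measured accuracy drops. Every name and parameter here (the model, the synthetic data, the eps budget) is a hypothetical assumption used only to show how adversarial inputs can degrade a model's performance; it is not part of this technique entry and does not describe any specific victim system.

```python
# Minimal sketch: adversarial inputs degrading a toy model's accuracy.
# All values below are illustrative assumptions, not a real attack recipe.
import numpy as np

rng = np.random.default_rng(0)

# --- Train a simple victim model on synthetic 2-class data ------------------
n, d = 1000, 20
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(d)
lr = 0.1
for _ in range(500):                       # plain gradient descent on log-loss
    p = sigmoid(X @ w)
    w -= lr * (X.T @ (p - y) / n)

def accuracy(X_eval, y_eval):
    return float(((sigmoid(X_eval @ w) > 0.5).astype(float) == y_eval).mean())

print("clean accuracy:      ", accuracy(X, y))

# --- FGSM-style adversarial inputs -------------------------------------------
# For logistic regression, the gradient of the per-example log-loss with
# respect to the input x is (p - y) * w, so stepping each input in the sign
# of that gradient raises the loss and flips predictions.
eps = 0.3                                  # perturbation budget (hypothetical)
p = sigmoid(X @ w)
grad_x = (p - y)[:, None] * w[None, :]
X_adv = X + eps * np.sign(grad_x)

print("adversarial accuracy:", accuracy(X_adv, y))
```

Run repeatedly over time, degradation like this is what erodes operator confidence: the model still runs, but its outputs become unreliable enough that the organization reverts to manual processing while investigating.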

Version: 0.1.0

Created At: 2025-03-04 10:27:40 -0500

Last Modified At: 2025-03-04 10:27:40 -0500


Related Tactics

  • --> Impact (tactic): Compromising the integrity of machine learning models to produce incorrect or unreliable results.