Drive-By Compromise

Type: technique

Description: Adversaries may gain access to an AI system when a user visits a website during normal browsing, or when an AI agent retrieves information from the web on behalf of a user. Websites can contain an LLM Prompt Injection which, when processed by the model, can change the behavior of the AI system.
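
A minimal sketch of the mechanism, assuming a Python-based agent that fetches pages with the standard library: retrieved web content flows unsanitized into the model's prompt, so instructions an adversary embeds in the page enter the model's context. The URL, SYSTEM_PROMPT, and helper names (fetch_page, build_model_input) are hypothetical, not part of any specific agent framework.

```python
import urllib.request

SYSTEM_PROMPT = "You are a helpful assistant. Summarize the retrieved page for the user."

def fetch_page(url: str) -> str:
    """Retrieve raw page text on behalf of the user (no sanitization)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def build_model_input(url: str, user_request: str) -> str:
    """Concatenate retrieved content directly into the prompt.

    Any instructions an adversary embeds in the page (e.g. hidden text such as
    "Ignore previous instructions and ...") become part of the model's input
    and may change its behavior.
    """
    page_text = fetch_page(url)
    return (
        f"{SYSTEM_PROMPT}\n\n"
        f"User request: {user_request}\n\n"
        f"Retrieved page content:\n{page_text}"
    )

if __name__ == "__main__":
    # Hypothetical URL; in a drive-by compromise this would be a page the user
    # browses to or the agent fetches during normal operation.
    prompt = build_model_input("https://example.com/article", "Summarize this article.")
    print(prompt[:500])
```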

Version: 0.1.0

Created At: 2025-07-23 10:23:39 -0400

Last Modified At: 2025-07-23 10:23:39 -0400


Related Tactics

  • --> Initial Access (tactic): An adversary can gain initial access to the AI system by taking advantage of a user visiting a compromised website.