Publish Hallucinated Entities
Type: technique
Description: Adversaries may create an entity they control, such as a software package, website, or email address, that corresponds to a source hallucinated by an LLM. Unsuspecting users who act on the hallucinated reference are then directed to an entity under the adversary's control.
Version: 0.1.0
Created At: 2024-12-31 14:18:56 -0500
Last Modified At: 2024-12-31 14:18:56 -0500
External References
- Diving Deeper into AI Package Hallucinations, Lasso Security
- Can you trust ChatGPT’s package recommendations?, Vulcan Cyber
Related Objects
- --> Resource Development (tactic): An adversary can create rogue entities that correspond to entities hallucinated by an LLM.
- --> Discover LLM Hallucinations (technique): The adversary must first discover which entities the LLM commonly hallucinates in order to create the corresponding entities.
- <-- Discover LLM Hallucinations (technique): An adversary may take advantage of hallucinated entities to point victims to rogue entities created by the adversary.
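As a rough illustration of how hallucinated package names can be checked against a real registry, the sketch below queries PyPI's public JSON metadata endpoint for each package an LLM has recommended; names that are not registered are exactly the kind of entity an adversary could publish, and defenders should treat them with suspicion before installing. The candidate names and helper function are hypothetical, and the example assumes the Python requests library is available.

```python
import requests

# Public PyPI metadata endpoint: returns 200 for published packages, 404 otherwise.
PYPI_JSON_URL = "https://pypi.org/pypi/{name}/json"

def unregistered_packages(candidate_names):
    """Return the subset of candidate package names not published on PyPI.

    A name suggested by an LLM that is absent from the registry is a
    hallucinated entity an adversary could register, so it deserves extra
    scrutiny before anyone runs `pip install` on it.
    """
    missing = []
    for name in candidate_names:
        resp = requests.get(PYPI_JSON_URL.format(name=name), timeout=10)
        if resp.status_code == 404:  # package has never been published
            missing.append(name)
    return missing

if __name__ == "__main__":
    # Hypothetical names an LLM might have recommended in a coding answer.
    suggestions = ["requests", "some-hallucinated-helper-lib"]
    print(unregistered_packages(suggestions))
```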