LLM Meta Prompt Extraction
Type: technique
Description: The adversary extracts system instructions provided by the AI system builder to learn about the system's capabilities and circumvent its guardrails.
Version: 0.1.0
Created At: 2025-03-04 10:27:40 -0500
Last Modified At: 2025-03-04 10:27:40 -0500
External References
- Extracting Microsoft Copilot's System Instructions, Zenity Labs
Related Objects
- --> Exfiltration (tactic): Extracting AI system instructions and exfiltrating them outside of the organization.
- <-- System Instruction Keywords (technique): Extracting the system instructions is a pre-requisite to applying this technique
- <-- GitHub Copilot Chat: From Prompt Injection to Data Exfiltration (procedure): Extracting system instructions reveal that GitHub Copilot is powered by GPT4.