Procedures
- ChatGPT and Gemini jailbreak using the Crescendo technique
- Google Gemini: Planting Instructions For Delayed Automatic Tool Invocation
- Microsoft Copilot: From Prompt Injection to Exfiltration of Personal Information
- Exfiltration of personal information from ChatGPT via prompt injection
- Financial Transaction Hijacking With M365 Copilot As An Insider
- GitHub Copilot Chat: From Prompt Injection to Data Exfiltration
- Microsoft Copilot Purview Audit Log Evasion and DLP Bypass
- Copilot M365 Lures Victims Into a Phishing Site
- Data Exfiltration from Slack AI via indirect prompt injection
- spAIware