Whatever your need as a hacker post-compromise, Microsoft Copilot has got you covered.
- Covertly search for sensitive data and parse it nicely for your use
- Exfiltrate it without generating logs
- Most frightening of all, Microsoft Copilot will help you phish to move laterally
Heck, it will even social engineer victims for you!
This talk is a comprehensive analysis of Microsoft Copilot taken to red-team-level practicality. We will show how Copilot plugins can be used to install a backdoor into other users' Copilot interactions, allowing for data theft as a starter and AI-based social engineering as the main course.
We'll show how hackers can circumvent built-in security controls, which focus on files and data, by turning the AI itself against them.
Next, we will drop LOLCopilot, a red-teaming tool for abusing Microsoft Copilot as an ethical hacker to do all of the above. The tool works with the default configuration in any Microsoft 365 Copilot-enabled tenant. Finally, we will recommend detection and hardening measures you can put in place to protect against malicious insiders and threat actors with Copilot access.
- Michael Bargury | CTO, Zenity
- Tamir Ishay Sharbat | Software Engineer, AI Security, Zenity
- Gal Malka | Software Engineering Manager, Zenity
- Lana Salameh | Software Engineering Manager, Zenity
Full Abstract & Presentation Materials: https://labs.zenity.io/p/links-materials-living-off-microsoft-copilot