

Reprompt attack let hackers hijack Microsoft Copilot sessions
Published:
14 January 2026 at 14:00:00
Alert date:
14 January 2026 at 15:01:03
Source:
bleepingcomputer.com
Enterprise Applications, Data Breach & Exfiltration, Emerging Technologies
Researchers discovered a new attack method called 'Reprompt' that enables attackers to infiltrate Microsoft Copilot sessions and execute commands to exfiltrate sensitive data. This attack technique poses a significant threat to organizations using Microsoft Copilot as it can compromise AI assistant sessions and potentially lead to data breaches. The vulnerability allows unauthorized access to user sessions through manipulation of AI prompts. The attack method represents a new class of AI-specific security threats targeting popular enterprise AI tools.
Technical details
The Reprompt attack exploits Microsoft Copilot Personal through three techniques: Parameter-to-Prompt (P2P) injection using the 'q' parameter in URLs to inject malicious instructions, Double-request technique to bypass Copilot's data-leak safeguards on subsequent requests, and Chain-request technique for continuous data exfiltration through dynamic instructions from attacker's server. The attack leverages authenticated Copilot sessions that remain valid even after tabs are closed, requiring only a single click on a malicious URL to maintain persistent access.
Mitigation steps:
Apply the latest Windows security update from January 2026 Patch Tuesday immediately. Note that Microsoft 365 Copilot enterprise customers are protected by additional security controls including Purview auditing, tenant-level DLP, and admin-enforced restrictions.
Affected products:
Microsoft Copilot Personal
Related links:
https://www.bleepingcomputer.com/news/microsoft/microsoft-january-2026-patch-tuesday-fixes-3-zero-days-114-flaws/
https://learn.microsoft.com/en-us/purview/dlp-policy-reference
Related CVE's:
Related threat actors:
IOC's:
This article was created with the assistance of AI technology by Perceptive.

