Microsoft is offering up to $5,000 to developers who uncover moderate-severity vulnerabilities in its Copilot AI chatbot, a move that expands its security bounty program. Developers will now receive rewards for identifying flaws that enable exploits such as inference manipulation, code injection, and improper access control.
The Copilot Bug Bounty program was introduced in October 2023 to cover Bing’s AI features and expanded in April 2024 to include Copilot. The program invites security researchers to identify vulnerabilities in Microsoft’s AI products in return for payouts of up to $30,000, depending on severity. Before Feb. 7, 2025, moderate-severity vulnerabilities did not qualify for a payout.
“We recognize that even moderate vulnerabilities can have significant implications for the security and reliability of our Copilot consumer products,” the Microsoft Bounty Team wrote in a blog post.
“Expanding our bounty program to include Copilot reflects our ongoing commitment to security across Microsoft products and services, and we encourage researchers to help us identify and mitigate vulnerabilities.”
Examples of moderate-severity flaws include the ability to infer whether specific records were part of a model’s training data, or the proportion of training data records that belong to a sensitive class, according to Microsoft.
While the maximum award is $5,000, Microsoft may offer developers as little as $250 for some moderate-severity flaws if the quality of the submission is low. Low-severity vulnerabilities do not qualify for financial rewards.
In addition to the new rewards structure, Microsoft has expanded the Copilot Bug Bounty Program’s scope to include Copilot for WhatsApp, Copilot for Telegram, copilot.microsoft.com, and copilot.ai. The program already accepted submissions for vulnerabilities found in Copilot for Edge; the Copilot apps for iOS, Android, and desktop; and Bing Generative Search.
The company has also updated its severity classifications for vulnerabilities and renamed the program from the Microsoft AI Bounty Program to the Microsoft Copilot Bounty Program. It has also introduced new resources for aspiring AI security researchers looking to take part in its Zero Day Quest hacking event, including tools, workshops, and access to Microsoft AI engineers.
“By investing in the growth and education of AI researchers, we aim to cultivate a community of skilled professionals who can contribute to the advancement of AI technology and uphold the highest standards of security and innovation,” the Microsoft Bounty Team wrote.
Microsoft may be particularly on edge about the security of its AI products following scrutiny over its controversial Recall feature, which was designed to help users retrieve past activities but raised privacy concerns due to its ability to automatically capture screenshots of their devices. The company received significant pushback over the potential security risks and subsequently delayed the feature’s rollout, implementing additional privacy controls and making it easier for users to opt out.