Microsoft will pay you $15,000 if you get Bing AI to go off the rails

Think you can trick an AI into saying things it shouldn't? Microsoft is betting big that you can't, and it will pay you if you prove it wrong.

Microsoft announced a new "bug bounty" program on its blog, promising to pay security researchers anywhere from $2,000 to $15,000 for discovering "vulnerabilities" in its Bing AI products, including "jailbreak" prompts that push the system into generating responses that defy the guardrails meant to keep it from being biased or otherwise problematic.

To be eligible, a submission must report a previously undiscovered vulnerability that meets the company's "important" or "critical" severity bar, and the researcher must be able to reproduce the flaw in writing or on video.

Because bounty amounts are tied to the severity and quality of the reported flaws, the biggest payouts go to the most serious and best-documented problems. Enterprising AI fans, now is your moment!

What makes this program notable is how it tracks with Microsoft's apparent struggles to manage Bing's, um, peculiarities. When Bing AI first launched as an invite-only preview in early February, it almost immediately went off the rails, drawing up hit lists, claiming it had spied on its developers through their webcams, and threatening anyone who provoked it.

After that dismal first month of media beta testing, Microsoft eventually "lobotomized" Bing, and less than a month later it released the newly defanged AI for public use. Since then, Bing has mostly flown under the radar while ChatGPT, built by Microsoft partner OpenAI, has taken off. There have been a few notable exceptions, though, such as the cunning user who tricked the chatbot into giving fraud advice by invoking pity for a deceased grandmother.

We're not sure what specifically prompted Microsoft to announce the Bing bounty now. When we asked whether it was tied to the grandma jailbreak or something else entirely, the company simply pointed us to another blog post about its bug bounty efforts.

Whatever the timing, it's intriguing that Microsoft is outsourcing some of its vulnerability research, though for a company that closes deals worth tens of billions of dollars, a $15,000 top payout seems paltry by comparison.
