Microsoft Software Engineer Reveals Copilot AI Tool Generated Disturbing Images During Testing

By Thea Felicity

Mar 06, 2024 11:40 AM EST

(Photo: LIONEL BONAVENTURE/AFP via Getty Images)
This picture, taken on January 23, 2023 in Toulouse, southwestern France, shows screens displaying the logos of Microsoft and OpenAI. Microsoft extended its partnership with OpenAI, the research lab and creator of the conversational AI application ChatGPT, on January 23 in a "multiyear, multibillion dollar investment".

An artificial intelligence engineer at Microsoft was shocked by the disturbing images he encountered while testing Copilot Designer, an AI image generator developed by Microsoft. 

Shane Jones, who has been with Microsoft for six years, is a principal software engineering manager at the company's headquarters in Redmond, Washington. 

In his free time, Jones red-teamed the product for vulnerabilities.

His discoveries were alarming: Copilot Designer was generating images depicting demons, violent scenes, sexualized content, and other inappropriate imagery that violated Microsoft's responsible AI principles.

In an interview with CNBC, Jones revealed he immediately began reporting his findings internally, urging the company to take Copilot Designer off the market until better safeguards could be implemented. 

Unfortunately, Microsoft's response fell short of Jones's expectations. 

Instead of effectively addressing the issue, the company referred him to OpenAI, the technology's provider, and took no further action. 

Frustrated by the lack of progress, Jones posted an open letter on LinkedIn and contacted U.S. senators and the Federal Trade Commission to raise awareness about the potential risks posed by Copilot Designer.


In January, Jones escalated his concerns by writing to Federal Trade Commission Chair Lina Khan and Microsoft's board of directors, urging them to take immediate action. 

In his letter to Microsoft's board, Jones requested an investigation into certain decisions made by the legal department and management and an independent review of Microsoft's responsible AI incident reporting processes. 

He emphasized the urgent need for disclosures to be added to the product and for its rating on Google's Android app store to be adjusted to reflect its mature content. Jones stressed that Microsoft and OpenAI had been aware of the risks associated with Copilot Designer before its public release.

The public scrutiny intensified following Google's decision to temporarily suspend its AI image generator due to user complaints, as reported by VCPost. 

Microsoft has yet to comment on this issue.


© 2024 VCPOST, All rights reserved. Do not reproduce without permission.
