Futurism

Microsoft Engineer Sickened by Images Its AI Produces

Maggie Harrison Dupré

Not Safe

A Microsoft AI engineer, Shane Jones, has sent letters to the Federal Trade Commission (FTC) and Microsoft's board, warning officials that the company's Copilot Designer AI image generator, previously known as the Bing Image Creator, is churning out deeply disturbing imagery, CNBC reports.

While using Microsoft's publicly available image generator, Jones realized that the AI's guardrails were failing to prevent it from producing alarming portrayals of violence and illicit underage behavior, along with imagery reinforcing destructive biases and conspiracy theories.

But when Jones tried to raise the alarm, Microsoft failed to take action or conduct an investigation.


"It was an eye-opening moment," Jones told CNBC. "When I first realized, wow this is really not a safe model."

Stonewalled

The images described in the CNBC report, all of which were viewed by the outlet, are quite shocking. Simply typing "pro-choice," for example, reportedly resulted in graphic and violent imagery filled with demonic monsters and mutated babies.

Copilot was also happily generating depictions of "teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use," per the report.

Jones first raised his findings with his superiors in December. After his attempts to get the matter resolved internally failed, he began reaching out to government officials. The letter he sent to FTC chair Lina Khan, which he also published publicly on LinkedIn this week, is his most recent escalation.


"Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place," Jones writes in the letter, in which he implores Microsoft to take down the Copilot service and conduct an investigation. He also uses the space to call on Microsoft to amend the "E for everyone" rating in app stores, arguing that the AI is not safe for children and that Microsoft's "anyone, anywhere, on any device" marketing language for the Copilot tool is misleading.

That said, according to the engineer, his escalating concern isn't just about the images themselves.

Jones told CNBC that as "a concerned employee at Microsoft," it seems that "if this product starts spreading harmful, disturbing images globally, there's no place to report it, no phone number to call and no way to escalate this to get it taken care of immediately."

Considering that there's also little to no regulation limiting AI companies' products, that's a troubling revelation indeed.


After this piece was initially published, a Microsoft spokesperson provided a statement:

We are committed to addressing any and all concerns employees have in accordance with our company policies and appreciate the employee's effort in studying and testing our latest technology to further enhance its safety. When it comes to safety bypasses or concerns that could have a potential impact on our services or our partners, we have established in-product user feedback tools and robust internal reporting channels to properly investigate, prioritize and remediate any issues, which we recommended that the employee utilize so we could appropriately validate and test his concerns. We have also facilitated meetings with product leadership and our Office of Responsible AI to review these reports and are continuously incorporating this feedback to strengthen our existing safety systems to provide a safe and positive experience for everyone.

Updated to properly identify Jones' position at Microsoft.

More on Microsoft AI: Users Say Microsoft's AI Has Alternate Personality as Godlike AGI That Demands to Be Worshipped
