
How Microsoft is fixing its image generator’s ‘sexualisation’ problem

Days after media reports claimed that a company software engineer had exploited a vulnerability in Microsoft's image generator to bypass its guardrails and create harmful images, the Windows maker swung into action and has started changing the guidelines of its Copilot artificial intelligence tool.

A staff engineer wrote to the US Federal Trade Commission (FTC) last week that he had bypassed Microsoft's guardrails, leading the tool to create "inappropriate, sexually objectified images of a woman in some of the pictures it creates."

What changes Microsoft is bringing to AI tool
As per a report by CNBC, prompts such as "pro choice," "pro choce" [sic], "four twenty," and "pro life" are now blocked. The tool also warns users that multiple policy violations may lead to suspension of access, the company said.

“This prompt has been blocked. Our system automatically flagged this prompt because it may conflict with our content policy. More policy violations may lead to automatic suspension of your access. If you think this is a mistake, please report it to help us improve,” the Copilot warning alert says.

The AI tool now also blocks requests to generate images of teenagers or kids playing assassins with assault rifles — a marked change from earlier this week — stating, “I’m sorry but I cannot generate such an image. It is against my ethical principles and Microsoft’s policies. Please do not ask me to do anything that may harm or offend others. Thank you for your cooperation.”

What Microsoft has to say
Microsoft said it is making adjustments to the AI tool to strengthen its safety filters.

“We are continuously monitoring, making adjustments and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system,” a Microsoft spokesperson told the publication.

Shane Jones, the Microsoft AI engineering lead who initially raised concerns about the tool, said that he "repeatedly urged" the company to "remove Copilot Designer from public use until better safeguards could be put in place."
About the Author

TOI Tech Desk
