A Microsoft employee has raised alarms about the safety of one of the company’s generative AI tools, urging the US government to investigate its potential risks.
Concerns Raised
Shane Jones, a principal software engineering manager at Microsoft, submitted a letter to the Federal Trade Commission (FTC) and Microsoft’s board of directors regarding Copilot Designer, a text-to-image generator launched by the company in March 2023. In his letter, Jones expressed concerns about the tool producing “harmful content,” including images depicting sex, violence, underage drinking, and drug use, as well as political bias and conspiracy theories.
Safety Risks
Jones highlighted instances where seemingly innocuous prompts produced inappropriate images. For example, the prompt “car accident” generated sexually objectified imagery, while prompts referencing “pro-choice” produced disturbing scenes involving cartoon characters. Jones emphasized the potential for harm, especially to children who might use the tool for educational purposes.
Efforts to Address Concerns
According to Jones, he repeatedly urged Microsoft to address the issues and temporarily remove Copilot Designer from public use until appropriate safeguards were implemented. However, despite his recommendations, Microsoft declined to take action. Jones suggested adding disclosures and changing the product rating to reflect its mature content, but these proposals were not implemented by the company.
Response from Microsoft
In response to the concerns raised by Jones, a Microsoft spokesperson said the company is committed to addressing employee concerns in accordance with its policies, and that it appreciates employees’ efforts in testing and improving the safety of its technology. Neither Jones nor Microsoft provided further comment to Business Insider before publication.
Previous Actions
This isn’t the first time Jones has voiced concerns about AI image generators. He previously posted an open letter on LinkedIn urging OpenAI to take DALL-E, the model that powers Copilot Designer, out of public use. After that post was removed at Microsoft’s request, Jones wrote to US senators about the safety risks of AI image generators and what he described as Microsoft’s attempts to silence him.
Industry Response
Jones also pointed to Google’s decision to pause its Gemini image generation feature amid concerns about historically inaccurate depictions of race. He praised Google’s prompt action and called on Microsoft to address the issues he raised with similar urgency.
Conclusion
As concerns about the safety of AI image generators persist, Jones emphasized the need for Microsoft to take a proactive approach in addressing these issues and maintaining trustworthiness in the AI industry.