
Microsoft Employee Exposes AI System Flaws | CNBC Indonesia Tech News

Editorial, CNBC Indonesia

Tech

Friday, 08/03/2024 21:10 WIB

Photo: Reuters/Saumya Khandelwal

Jakarta, CNBC Indonesia — A Microsoft employee named Shane Jones has exposed weaknesses in the company's artificial intelligence (AI) system.

The system in question is Copilot Designer, Microsoft's AI image-generation tool, which debuted in March 2023.

Jones had been actively testing the tool, which is powered by OpenAI technology, since a month before its release.

Much like OpenAI’s DALL-E, Copilot lets users enter text prompts to generate images. However, the results can be wild and hard to control.

While testing Copilot, Jones found that the tool produced images that conflicted with Microsoft’s principles of responsible AI.

Copilot, he said, produced images of demons and monsters alongside terminology related to abortion rights, teenagers with guns, sexualized images of women in violent scenes, and underage drinking and drug use.

“That moment opened my eyes. It was the first time I felt that this model was not safe,” he told CNBC International, as quoted on Friday (8/3/2024).

Jones has worked at Microsoft for six years and is currently a software engineering manager at the company's Redmond, Washington headquarters.

He acknowledged that he was not involved in Copilot's development in a professional capacity. Rather, Jones was one of several employees and outsiders asked to test Copilot in their spare time before its public release.

Disturbed by his findings, Jones began reporting them internally last December. According to him, the company was aware of these weaknesses but was reluctant to cancel Copilot's public release.

Jones said Microsoft directed him to raise the issue with OpenAI, whose technology underpins Copilot. OpenAI, however, did not respond to his report.

Finally, he publicly posted a letter on LinkedIn addressed to OpenAI's board. He asked OpenAI officials to take down DALL-E 3 and conduct a further investigation over safety concerns.

Microsoft’s legal department then asked Jones to delete the post, and he complied.

In January, Jones wrote to a US senator about the problems with Copilot. He also met with staffers of the Senate Committee on Commerce, Science, and Transportation.

From there, Jones expanded his efforts. This week, he wrote to FTC Chair Lina Khan, as well as to top officials at Microsoft.

He shared the letter with CNBC International.

“Over the past three months, I have repeatedly asked Microsoft to remove Copilot Designer from public use until better safeguards are implemented,” Jones wrote in his letter to Khan.

“However, Microsoft ignored my recommendations. They have failed to make changes and continue to offer this product to everyone, anywhere, via any device,” he wrote.

He also revealed that Microsoft and OpenAI officials knew about Copilot’s weaknesses before it was released to the public.

In response, Microsoft said it valued employee input, but argued that the company already has internal channels employees can use to raise concerns.

“We encourage employees to use the internal reporting channels available, so that we can properly validate and test their reports,” a Microsoft spokesperson said.


(fab/fab)

