Microsoft’s AI language interpreter could be a boon for cybercriminals


Full story

Microsoft unveiled a new Teams AI interpreter on Tuesday, Nov. 19. The program can replicate a user’s voice in near-real-time in nine different languages.

The company said it plans to expand to 31 languages in the future. The initial languages reportedly include English, French, German, Italian, Chinese, Korean, Japanese, Portuguese and Spanish.

The feature is currently available to a select group of users. Officials said that will expand to more customers with a Microsoft 365 Copilot license in 2025.

The company touts the new technology as a less expensive way to hold international phone calls or meetings, eliminating the need for a human translator.

However, the new feature isn’t perfect yet: Microsoft admits the Teams AI interpreter may not be 100% accurate.

Still, critics say accuracy isn’t their biggest concern about the AI feature. They fear the technology could open the door to fraudsters.

Security analysts warn of potential hackers using the technology, noting deepfakes are already a problem and impersonation scams reportedly cost Americans more than $1 billion in 2023.

In 2024, scammers used deepfake technology to set up a fake Teams video conference call and stole $25 million from a multinational firm.

An anonymous threat analyst group is already skeptical of Microsoft’s new technology, posting on X, “Ever be North Korean but want to sound American? It’s now possible,” apparently poking fun at the Big Tech company.

However, the group’s concerns may not just be talk. A recent report by SecureWorks warned of North Korean hackers applying for IT jobs at companies across the United States, United Kingdom and Australia, in an attempt to steal company secrets.

Cybersecurity experts urge companies and organizations worried about impersonation scams to opt out of the new Microsoft feature’s voice replication and use the generic voice simulator option instead.

Microsoft users will reportedly have to give consent to the AI interpreter through privacy settings for it to use voice simulation during a meeting.

They can opt out of the voice replication by disabling it in settings. The interpreter will then use a default voice rather than a replica of the speaker’s.

