Deepfake of principal's voice is the latest case of AI being used for harm

The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.

The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deep-fake technology, experts say.

“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.

Here’s what to know about some of the latest uses of AI to cause harm:

AI HAS BECOME VERY ACCESSIBLE

Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon, as is the speed with which that altered material can spread on social media.

The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI, which can create hyper-realistic new images, videos and audio clips. The technology has become cheaper and easier to use in recent years, lowering the barrier to entry for anyone with an internet connection.
