Deepfake of principal's voice is the latest case of AI being used for harm

Author: Global Grooves news portal | Source: entertainment | 2024-05-21 17:07:46

The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.

The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deep-fake technology, experts say.

“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.

Here’s what to know about some of the latest uses of AI to cause harm:

AI HAS BECOME VERY ACCESSIBLE

Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon. So is the ability for it to spread quickly on social media.

The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI, which can create hyper-realistic new images, videos and audio clips. It has become cheaper and easier to use in recent years, lowering the barrier to entry for anyone with an internet connection.
