    Study Shows Speech Deepfakes Frequently Fool People, Even After Training on How to Detect Them

    A new study from the University of California, Berkeley, has found that speech deepfakes can frequently fool listeners, even after those listeners have been trained to detect them.

    The study, published in the journal _PLoS One_, asked 33 people to listen to a series of speech deepfakes and real recordings. Participants then rated their confidence that each recording was real or fake.

    The results showed that participants were fooled by the speech deepfakes about 50% of the time — essentially chance level — even after they had been trained to detect them. This suggests that speech deepfakes are becoming increasingly sophisticated and difficult to spot.

    The study's authors say their findings have implications for the use of speech deepfakes in the real world. They warn that speech deepfakes could be used to spread misinformation, impersonate others, or commit fraud.

    "Our findings suggest that we need to be more cautious about believing what we hear," said study co-author Dan Goldstein. "Speech deepfakes are becoming more and more realistic, and they are getting harder to detect."

    The study's authors recommend that people take the following steps to protect themselves from speech deepfakes:

    * Be aware that speech deepfakes exist and be skeptical of any audio recordings that you come across.

    * If a recording is accompanied by video, pay attention to visual cues. If the person's mouth doesn't move in sync with the words they are saying, it may be a deepfake.

    * Listen for strange or unnatural sounds in a recording. Deepfakes can have an electronic or robotic quality.

    * If you're not sure whether a recording is real or fake, try to find other sources of information to verify it.

    The study's authors say these steps can help people protect themselves from speech deepfakes, but they also acknowledge that the problem is serious and will require a concerted effort to address.

    "We need to continue to research speech deepfakes and develop new ways to detect them," said Goldstein. "We also need to raise awareness of the dangers of speech deepfakes and educate people about how to protect themselves."

    Science Discoveries © www.scienceaq.com