Audio deepfake scams: Criminals are using AI to sound like family and people are falling for it
Artificial intelligence (AI) is being used to recreate the sound of family members' voices in order to scam people out of thousands of euros.