News

Even “living offline” isn’t enough to combat deepfakes, as experts reveal the only way to defeat the “inevitable” AI threat



LIVING completely offline is not enough to avoid being caught in a deepfake attack, experts have warned.

It is now so easy to create fraudulent videos – also known as deepfakes – using artificial intelligence that almost anyone is at risk.

It's becoming increasingly simple (and fast) to create convincing deepfake videos. Credit: Getty

Deepfakes leverage AI applications to create videos that show people doing (or saying) things they didn’t do.

Typically, an AI model is trained on images of a person and then reproduces their face on another person's body.

And now deepfakes are so advanced that they can be created from a single photo – turning it into a video in minutes.

AI apps can even convincingly replicate your voice based on just a few seconds of audio.

The US Sun spoke to cybersecurity expert Adam Pilton, who warned that you risk “losing control” of your image.

“Risk has always existed on the internet: as soon as you upload material, be it a photo, text, audio or anything else, you lose control over it,” said Adam, cybersecurity consultant at CyberSmart and former cybercrime detective.

“And anyone can do whatever they want with it and display it in any context, positively or negatively.

“This is also the case with deepfakes, and the idea of preventing or reducing the images we put online comes too late, because for most people this information is already out there.

“We therefore need to adjust, and accept that deepfakes will inevitably happen unless you live completely offline.

“And instead, get better at spotting the signs that you’re seeing deception.”


Deepfakes will only become easier to create in the future.

And so being able to spot the signs that you’re watching a scam video will be key.

It might seem scary enough to force you offline – but even that might not save you.

“Even if you live offline, there is no way to control the content that appears online, as anyone can upload whatever they want,” warned Adam.

Deepfakes – what are they and how do they work?

Here’s what you need to know…

  • Deepfakes are fake videos of people that look perfectly real
  • They are made using computers to generate convincing representations of events that never happened
  • Often this involves swapping one person’s face with another’s or making them say whatever you want.
  • The process begins by feeding an AI with hundreds or even thousands of photos of the victim
  • A machine learning algorithm swaps certain parts frame by frame until it generates a realistic but fake photo or video
  • In a famous deepfake clip, comedian Jordan Peele created a realistic video of Barack Obama in which the former president called Donald Trump an “imbecile.”
  • In another, Will Smith’s face is pasted onto the character Neo in the action film The Matrix. Smith turned down the role to star in the failed film Wild Wild West, while the Matrix role went to Keanu Reeves

“The reality of living in the modern world with AI is that over time, images, text and any information about you will be online, AI will consume it and probably use it.”

PRETENDING

Adam said deepfakes are more likely to target well-known people – such as celebrities and politicians.

That’s because there’s more to gain by replicating these people’s likenesses.

So Adam says you should be more worried about the dangers of seeing deepfakes than about being turned into one.

DEFENSE AGAINST DEEPFAKES

Here’s what Sean Keach, head of technology and science at The Sun and The US Sun, has to say…

The rise of deepfakes is one of the most worrying trends in online security.

Deepfake technology can create videos of you even from a single photo – so almost no one is safe.

But although the situation seems a little bleak, the rapid rise of deepfakes has some upsides.

For starters, there is much greater awareness about deepfakes now.

Therefore, people will look for signs that a video may be faked.

Likewise, technology companies are investing time and money in software that can detect fake AI content.

This means social media will be able to flag false content for you with greater confidence – and more frequently.

As the quality of deepfakes increases, you will likely have difficulty spotting visual errors – especially a few years from now.

So your best defense is your common sense: apply thorough scrutiny to everything you watch online.

Ask if the video is something that would make sense for someone to fake – and who benefits from you seeing that clip?

If you’ve been told something alarming, a person is saying something that seems strange, or you’re being led into a hasty action, there’s a chance you’re watching a fraudulent clip.

Being cautious about what you see online can protect you from sinister and false scams and advertising.

“In the short term, the average person is less likely to be the subject of a deepfake; we are more likely to see deepfakes of people who are well-known and influential,” he told The US Sun.

“The broader near-term threat, as a consumer of online information, is recognizing what may be a deepfake.

“And whether you are going to give any weight to the information that this potential deepfake has provided, or even whether you are going to take action based on that information.”



This story originally appeared on The-sun.com.

