Most people wouldn't even question whether this was a real person. Photo / Thispersondoesnotexist.com
The majority of internet users probably like to think they could spot a dating scam from a mile away and just can't understand how anyone would fall for such a trick.
But there is a reason so many people fall for catfish or online dating scams, and it isn't because they are dumb or desperate.
Most people are aware of the major indicators of a dating scammer: asking for money, never wanting to video chat, and sharing very few pictures of themselves, news.com.au reports.
But scammers are constantly figuring out new ways to make their stories seem more believable and to get people to trust them.
Take this woman, for example. She is young and attractive, and it's unlikely many potential love interests would think twice about chatting with her on a dating app.
But this woman isn't real. And I don't mean in the sense that someone has stolen her picture from social media and is using it without her knowledge on dating apps.
She doesn't exist.
The image comes from a website called ThisPersonDoesNotExist.com, which uses AI to generate realistic-looking human faces at random.
Each time you refresh the page a new "person" is created.
While a single picture on its own might not seem like a very big threat, combined with the constant advances in deepfake technology it is real cause for concern.
Deepfake is an AI-based technology that produces hyper-realistic images and videos of situations that never happened.
These videos look so realistic it is hard to prove they are fake.
A recent example of the trouble this technology can cause came last year, when a video of Barack Obama appearing to call Donald Trump a "dipsh*t" made the rounds.
At certain points you can see blurring or distortion in the video that indicates it isn't real, but it gives an idea of just how dangerous this technology can be.
With this in mind, there is growing potential for scammers to use AI-generated images to create a whole new person.
Phillip Wang, the man behind the website ThisPersonDoesNotExist.com, told news.com.au he created it to prove a point to friends about AI technology.
"I then decided to share it on an AI Facebook group to raise awareness for the current state of the art for this technology. It went viral from there," Mr Wang said.
When asked if he had any concerns about people using the images to catfish or scam others, he said that concern already existed long before the website was made.
"Anyone can download the code and the model and instantly start generating faces on their own machine," he said.
Mr Wang said that by showing people just how easy it was to make a fake person, the site was helping to raise awareness of the implications this kind of technology might have in the future.
He said it was becoming increasingly difficult to tell deepfakes from reality, and it was "beyond something that simple photoshop forensics can help defeat".
HOW IT IS BEING USED NOW
There are an increasing number of cases of deepfakes being used to create fake revenge porn and celebrity porn.
Zach, a senior reputation analyst at Internet Removals, an organisation that helps people get sensitive content taken down from the internet, said his team first encountered deepfakes in 2017.
"One of our staff was alerted to naked pictures of this A-list celebrity being shared around the internet. We looked it up and there were tonnes of images, and we just couldn't wrap our heads around how it was being done," he told news.com.au.
"We didn't know what we were dealing with. We initially thought it was a group of sick individuals manually photoshopping each picture, which would take a very long time."
Unfortunately, there is very little people can do to protect themselves from becoming targets of these online attacks. And even getting the photos taken down once they are created can be difficult.
"The person who created the image is often protected as they are seen as being the author of the work as the image is technically created by them," Zach said.
"It can already be a tricky process to get images removed from the internet, but it becomes even harder when deepfake is involved."
There are already signs of how scammers are using this technology to their advantage.
Zach said his team came across a scammer on Tinder who encouraged people to video chat with them. Usually, this is something a scammer or bot tries to avoid, because the person they are talking to will realise they aren't real.
But when targets accepted the video chat, it showed a woman undressing and encouraging them to do the same.
The only indication that something was wrong was that the audio didn't match the movement of the woman's mouth.