IMAGINE opening your laptop one morning to find your face superimposed on a different body in a pornographic video. It isn't you in the footage – but it looks like it to anyone watching.

The clip is spreading and messages from friends and family are starting to flood in. There's a risk it could go viral – your life changed irreparably without your consent.

That was the reality for Pokimane and QTCinderella, two female streamers on Twitch, the popular broadcasting website for gamers, who found themselves at the centre of a deepfake storm last week.

Fellow streamer Brandon Ewing, who goes by Atrioc online and has 319,000 followers on the platform, faced a furious backlash after he was caught out visiting a pornographic website which sells deepfake content of fellow creators.


He had inadvertently shared a browser window showing the website with his followers on Monday; a tearful apology quickly followed and subsequently went viral.

But the damage had already been done – not only had Atrioc drawn attention to the existence of the site, later taken down by its creator after a surge of outrage, but the impact on the women caught up in the row was already evident.

QTCinderella, in a livestream on January 30, bravely spoke out because she wanted to show the harm it had caused her.

“This is what it looks like to feel violated, this is what it looks like to feel taken advantage of.

“This is what it looks like to see yourself naked against your will being spread all over the internet,” she told her followers.

What are deepfakes?

Artificial intelligence (AI) technology has advanced to the point where users can take a piece of existing pornographic material and essentially graft the face of anyone they wish on to the actor or actress. These fakes can be videos or photographs, and they are becoming more realistic as the technology progresses. According to the Scottish Women’s Rights Centre (SWRC), this fake pornography is “almost always” of women and created without their knowledge or consent.

The technology instantly became controversial when it appeared in 2017 and has frequently been used at the expense of female celebrities, and now Twitch streamers.

One popular app, DeepNude, was removed from the internet in July 2019 by its creator but copies of the software were still widely available online afterwards, and numerous other websites have popped up in its place since.

Concerns have been raised that deepfakes could increasingly be used to create revenge porn – another form of harassment and sexual violence against women, now taking place online.

Sensity, an AI company which detects deepfakes, has previously claimed that the number of videos using this technology on the internet doubles every six months, and that the number of people affected has increased by a third each year since 2019.

What is the law on deepfake pornography in Scotland?

There are existing protections in Scots law to deal with deepfake pornography, which has previously been described as “ahead of the game” compared with the rest of the UK.

In 2016, an offence of “non-consensual sharing of intimate images” was introduced in Scotland. While it is not a crime to share intimate videos with another consenting adult, it is a crime to share images which show, or appear to show, another person in an intimate situation without that person’s consent.

The provisions which relate to “appear to show” give the police and courts the ability to pursue convictions in relation to altered images and deepfakes, but the law does not go so far as to include completely computer-generated images such as animations, cartoons and graphics that don’t include existing images.

The Scottish Government told the Sunday National that it is “not possible” to identify from recorded statistics whether any cases of deepfake pornography have come before the courts in Scotland.

But could further protections be brought in to combat this emerging threat?

“While ultimately a matter for the courts, the offence is relevant in the context of faked images appearing to show a person in an intimate situation,” a Scottish Government spokesperson said.

They added: “Baroness Kennedy’s report on misogyny recommended the creation of a statutory sentencing aggravation concerning misogyny.

“As set out in the Programme for Government, we will consult on specific draft laws to implement recommendations no later than summer 2023, with a view to introducing a Misogyny and Criminal Justice Bill later in this Parliament.”

When Kennedy launched her report, Misogyny: A Human Rights Issue, on March 8 last year, we asked if added protections against deepfake pornography would be part of the considerations. Kennedy’s report called for a statutory aggravator for misogyny and the creation of a new offence of stirring up hatred against women and girls.

If the recommendations became law and there was a case relating to deepfake porn, this aggravator could lead to a harsher sentence.


She explained at the press conference launching the report: “There is law to prosecute that already, but if it was considered deeply misogynistic, which it usually is, then the aggravation would come into play.

“So, abusing images is already in law but you could use the aggravation if the courts felt that was the appropriate thing to do.”

However, Kennedy’s recommendations suggested this aggravator should not apply in cases of rape, other sexual offences and domestic abuse, where the misogynistic element is already recognised; the Scottish Government said an upcoming consultation will seek views on this.

The Crown Office and Procurator Fiscal Service (COPFS) said they recognised the “devastating impact” non-consensual sharing of images can have on victims.

“It is a priority for all prosecutors that we take action against offenders who share images that show or appear to show a person in an intimate situation without their consent, using all of the tools at our disposal within the existing legal framework,” a spokesperson said.

“We will regularly consult partners on emerging forms of offences which use technology to abuse and harass, ensuring victims are supported and that perpetrators are brought to justice.”

Police Scotland said there have been no “specific instances” of deepfake pornography reported to them, but they are “aware of the issue”, and encouraged any potential victims to report the crime.

What about the rest of the UK – will the Online Safety Bill cover deepfakes?

Yes, the Tories’ Online Safety Bill will criminalise deepfake pornography, “downblousing” (taking a photo down someone’s top without consent) and “upskirting” (the equivalent up a skirt). The bill will also establish an offence of cyber-flashing – the sending of unsolicited explicit images – and extend revenge porn laws to cover threatening to share images, as well as actually sharing them.

The additions to the bill follow recommendations from the Law Commission, which suggested creating stronger offences for sharing intimate images without consent.

The legislation has passed all stages in the House of Commons and its first and second readings in the House of Lords, and is now at committee stage.

How to get help

IF you find yourself in the position where intimate images, real or digitally altered, have been shared without your consent, there are a number of ways you can seek help.

Police Scotland said that while no specific cases of deepfake pornography had been reported to them, they would encourage any victims to come forward.

It is a crime in Scotland to share images which show, or appear to show, another person in an intimate situation without that person’s consent.

“We recognise that such offences may be extremely distressing and would urge anyone who may have been a victim of this type of crime to report this to police,” a spokesperson for the force said.

The Scottish Women’s Rights Centre also has an online guide setting out your rights if intimate images or videos are shared without consent. 

Support is also available through Rape Crisis Scotland and its helpline.

Sandy Brindley, chief executive of the charity, said: “It’s really important that the law keeps up with emerging technologies to protect victims of any kind of image-based abuse.

“Support for anyone who has experienced this kind of abuse, or any kind of sexual violence, is available every evening from the Rape Crisis Scotland helpline on 08088 010302 from 5pm-midnight.

“No matter how long ago it happened, when you’re ready to talk, our trained support workers are ready to listen.”