
Deepfakes, sexbots cannot be ignored by AI laws, expert says

With deepfake pornography and sex robots on the rise, a legal expert said the sex and intimacy industry should be part of the discussion around legislating artificial intelligence.

Naomi Neilson | 02 February 2024 | Big Law

While it is encouraging to see legal and government groups establish taskforces and advisory bodies to investigate artificial intelligence (AI) to determine how best to legislate it, Dr Nicole Shackleton said it would be wrong to leave sex and intimacy out of the conversation.

Dr Shackleton, a law lecturer at RMIT University, said AI changed the billion-dollar industry through products like dating apps, sex and intimate toys, and sex robots – or “sexbots” as they’re commonly known – with advanced technologies like Wi-Fi and Bluetooth.

“This industry has been around for a long, long time, but technology has changed it,” Dr Shackleton told Lawyers Weekly.


“It’s not just adding a battery to a vibrator, advanced technologies have allowed the creation of what is called teledildonics.”

Harmony, the world’s first talking sexbot introduced back in 2017, is one major example. Her creator, Matt McMullen, used advanced technologies not only to allow users to have a conversation with Harmony but also to program her personality and memory.

While Harmony appears harmless, and there are myriad benefits to introducing AI into sex and intimacy, Dr Shackleton said there are also problems worth exploring, which means inviting this industry into any discussion about legislating AI.

“The danger of creating a whole new advisory body, which is separate to the advisory body currently being created by the government’s attempt to create better and more safe uses of AI, is that the [government] does not have to turn their mind to these products or to sexual privacy issues,” Dr Shackleton explained.

“Ideally, it would be part of the conversation. The intimacy industry is a legitimate industry and should be at the table.”

Legislating the dark side of AI in sex and intimacy

Just a few years ago, the conversation around image-based sexual abuse largely centred on “revenge porn”, the criminal act of maliciously distributing private and intimate photographs, videos, and other material without the consent of the subject.

Now the conversation has shifted to “deepfake porn”, a kind of synthetic image or video created using already existing pornographic material and the face or likeness of a real person.

Dr Shackleton explained that deepfake material means “you don’t actually have to see someone naked or receive a naked picture of someone for you to humiliate them”.

In recent weeks, an explicit deepfake image of pop superstar and Time person of the year, Taylor Swift, was distributed on X, formerly known as Twitter. The fake image generated more than 27 million views in 19 hours before the platform blocked all searches.

More recently, Australian major media company Nine Network posted an image of MP Georgie Purcell that had been altered to make her breasts appear bigger and to expose her midriff.

While the network blamed this on “automation by Photoshop”, an Adobe spokesperson told media the edited image “would have required human intervention and approval”.

The edited image (left) and the original image of Georgie Purcell. Photo credit: Nine News.

Dr Shackleton said generated and altered images are not new and have been an “open secret” in online spaces, such as Reddit and 4chan, and shared over encrypted messaging apps like Telegram.

“I think there’s a bit of an assumption that as long as it’s kept in those spaces, it doesn’t matter.

“But then we see these big cases where it does start to come out, and more and more people are seeing it. You also can’t discount the fact it’s been predominantly targeting women for so long and has flown under the radar legislatively,” Dr Shackleton said.

Dr Shackleton said the use of AI in this way “falls within the spectrum of abuse and gendered abuse” and has been designed to shame, humiliate, and damage the reputation of mostly women.

“The idea that AI just altered Georgie’s image without instruction is problematic, but when it comes to Taylor Swift, there’s no coincidence this is happening at a time when she is dating an NFL star, and she’s being seen on the screen by all of these men who do not want her in this space,” Dr Shackleton said.

In its position statement, the eSafety Commissioner noted that 96 per cent of all deepfake material consisted of non-consensual sexual material, and the majority of the subjects were women.

Commissioner Julie Inman Grant warned this meant generative AI images were no longer “the stuff of science fiction”.

Dr Shackleton said existing laws around image-based sexual abuse in Australia would capture deepfake porn because it is still a crime to share sexually explicit images without consent, even if the image has been altered.

However, Dr Shackleton said there is an attitude of “we’ve got it on the books, so we’re done” from lawmakers.

“If it’s on the books, and it’s still happening, and it’s causing harm, and in fact, it’s growing, then we need to be thinking about different ways of approaching the problem and perhaps treating it around this broader concept of safe and responsible use of AI,” she said.

Other legislative gaps in the sex and intimacy AI space

In addition to making image-based abuse a priority, the discussion should also consider the concerning legislative gaps in the AI used for dating apps, intimate toys and sexbots.

“What’s really important to remember is there’s a heightened concern that people are taking risks because of the benefits that come from using this technology to enhance sexual connection, gratification and intimacy in relationships,” Dr Shackleton said.

“We humans rarely take risks without perceived benefits, but we have to remember there are privacy risks.”

Dr Shackleton said some existing legislation would already cover parts of this industry. In the most extreme example, Australia would prohibit the use and importation of sexbots that look like children, as child exploitation material is already strongly outlawed.

“But when it comes to the adult space around sex toys and sex robots, there is extremely little law … and there’s no desire to have action around this space,” Dr Shackleton said.

As for AI in dating apps and teledildonics, the concern shifts to the collection of sexual data and its potential misuse.

While there are data protection laws in place, Dr Shackleton said there has never been a lens on sex and intimacy.

There are also concerns about how someone may use AI to create a sexbot that can mimic someone’s likeness.

“There are no laws that would apply to that, other than perhaps defamation, but then someone is using that in the privacy of their own home,” Dr Shackleton explained.

Blanket legislation should not harm benefits of AI in sex

While there are a number of potential harms in introducing AI into the sex and intimacy industry, Dr Shackleton said the “enormous” benefits cannot be discounted – including improving human connectedness, establishing a sense of belonging, and allowing adults to explore sexual expression in a healthy environment.

When legal and government groups are meeting to talk about legislating AI in sex, Dr Shackleton said it is important to avoid creating a “risk-averse model that hampers these benefits”.

“It’s really important we don’t merely reduce sex tech to a negative or risk-based conversation and we consider the broader concept of sex technology with the potential benefits it can deliver.

“This requires nuance, and it does require [sex industry experts] at the table in the beginning of the conversation,” Dr Shackleton said.

Dr Shackleton added that while it is difficult to have an “open and honest” conversation about the sex and intimacy space, human connectedness is a vital part of what keeps people safe and happy.

“We know we’re safer sexually when we add in a discussion about the potential benefits that come from sex, intimacy, and physical touch, and it is the same with technology,” Dr Shackleton said.

“It’s understandable groups may not want to have this discussion because it’s potentially taboo, but I would encourage people looking at this issue to include it as part of a well-rounded conversation.”

Dr Shackleton’s comments on this issue are based on her research, particularly her work as a researcher on an Australian Research Council Discovery Project (2019–2021), Improving Australia’s legal, policy and educational response to the technological transformation of sex and intimacy, led by Professors Jennifer Power and Anne-Maree Farrell, which explored medical, mechanical, and digital technologies and their impact on the sexual and intimate lives of Australians.