Offenders creating and distributing sexually explicit deepfake material may face up to seven years of imprisonment following the passage of new legislation in Parliament.
Individuals who produce and circulate sexually explicit deepfake content now risk imprisonment, after Australia’s federal parliament approved new laws.
The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 was introduced in June and passed on Aug. 21.
The bill imposes strict criminal penalties on those who use artificial intelligence software and applications to create fake pornographic content from a victim’s likeness, including by superimposing the victim’s face onto explicit material.
Upon the bill’s introduction, Australia’s Attorney-General Mark Dreyfus highlighted that women and girls are frequently targeted and demeaned through such means.
The legislation will bolster existing Commonwealth Criminal Code offenses and introduce a new aggravated criminal offense for the dissemination of such content.
“These offenses will carry severe criminal penalties of up to six years’ imprisonment for the sharing of non-consensual deepfake sexually explicit material,” stated Dreyfus in a release.
In cases where the individual also created the non-consensual deepfake content being shared, an aggravated offense will apply, carrying a higher penalty of seven years’ imprisonment.
The law will also encompass the sharing of authentic images that have been distributed without consent.
“The new criminal offenses are structured around a consent model to adequately cover both artificial and authentic sexual material,” remarked Labor Senator Murray Watt during a Senate session on Aug. 21.
Shadow Attorney-General Michaelia Cash expressed reservations regarding certain aspects of the bill, particularly the potential cross-examination of victims in court.
This deepfake legislation complements other government efforts to combat cyberbullying and harm, including increased funding for the eSafety commissioner, an early review of the Online Safety Act, and a commitment to addressing practices like doxxing.
Advances in AI tools have fueled a surge of deepfake images online, a significant portion of which are sexually explicit.
A parliamentary inquiry revealed that the majority of deepfakes, around 90 to 95 percent, involve non-consensual pornography, with women accounting for 99 percent of the victims.
eSafety Commissioner Julie Inman Grant noted the prevalence of apps facilitating the creation of such content.
Senator Kerrynne Liddle from South Australia informed Parliament that a deepfake image could be generated in as little as two seconds.
“Deepfake imagery can have detrimental impacts on adults’ careers, but when exploited by criminals against our children, the consequences can be—and have been—fatal,” she emphasized.
“Australia’s eSafety commissioner has estimated a drastic increase of up to 550 percent in deepfake imagery annually since 2019.”
Liddle, serving as the shadow spokesperson for child protection, recounted instances where criminals leveraged deepfakes to extort money.
“Research released in June indicated that one in seven adults, equating to 14 percent, has faced threats of intimate image sharing,” she shared.
“In a global survey involving Australia, over 70 percent of individuals were unaware of what a deepfake entails, underscoring the need for increased public education.”
“It is imperative that young people grasp the severity and harm caused by such actions. We must intensify efforts to safeguard children from falling victim to, or becoming perpetrators of, these offenses,” she concluded.