Australian Attorney-General Mark Dreyfus introduced the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 to parliament on Wednesday. Dreyfus and Australian Prime Minister Anthony Albanese both contended that non-consensual deepfake sexually explicit material is a “damaging and deeply distressing form of abuse” and should therefore carry serious criminal penalties. The bill has received its first reading in parliament and is awaiting a second reading.
According to the bill, sharing non-consensual sexually explicit deepfake content will carry a penalty of up to six years’ imprisonment. The bill also provides two aggravated offenses, targeting repeat offenders and the creators of the content, each carrying a potential sentence of up to seven years’ imprisonment. The new offenses apply only to sexual material depicting adults; child abuse material will continue to be prosecuted under separate, existing charges.
Dreyfus, during an interview on local radio, acknowledged the potential difficulty of identifying and prosecuting those who share deepfakes online because of the anonymous nature of social media accounts. He maintained, however, that technological means of tracing could overcome difficulties in detection and prosecution, and that those difficulties are not a reason to refrain from prohibiting the creation of non-consensual deepfake sexually explicit material or from imposing serious criminal liability for this socially undesirable activity.
In addition to the new bill, the government also pledged to expand on previous initiatives, such as increasing funding for the eSafety Commissioner, addressing harmful conduct like doxxing, and revising the Privacy Act to give all Australians, especially women who are victims of domestic abuse, more control over their personal data.
Discussions of introducing this bill began in May when federal leaders met to address Australia’s gendered violence crisis. Albanese stated that the bill is intended to “keep women safe”, referring to the growing volume of pornographic deepfake content being made of women in Australia. A 2023 report by the social media analytics firm Graphika found a 2,000 per cent rise in the number of websites generating non-consensual sexual images using artificial intelligence. The federal government also committed AU$1 billion in funding towards addressing the crisis.