How Much is Your Voice Worth? The Growing Threat of AI Voice Cloning
In our increasingly digital world, the value of a person’s voice is being redefined. ABC News Verify recently demonstrated this by cloning federal senator Jacqui Lambie’s voice for approximately A$100. This simple example illustrates the increasing ease and affordability of AI-powered voice cloning, a technology raising serious concerns about its potential misuse.
The Rise of Deepfakes and Voice Cloning
Artificial intelligence (AI) applications that create synthetic replicas of a person’s image or voice, commonly known as deepfakes or voice clones, are becoming more accessible. This technology poses a significant threat not only to democratic processes, such as elections, but also to personal identity and security. Existing copyright laws in Australia are currently insufficient to protect individuals when their image or voice is replicated digitally without their consent.
Detecting the Deception: The Challenges of Deepfakes
Deepfake technology is becoming alarmingly realistic, making it increasingly challenging to distinguish between what is authentic and what is fabricated. Shockingly, some people who heard the ABC’s voice clone of Senator Lambie initially believed it was genuine. This demonstrates how unauthorized deepfakes and voice cloning can easily generate misinformation, potentially causing severe harm to individuals.
A notable example dates back to 2020, when one of Australia’s first political deepfake videos surfaced, featuring then-Queensland premier Annastacia Palaszczuk. The video, viewed approximately 1 million times on social media, falsely claimed the state was in massive debt. The impact of such false information can be significant.
Legal Frameworks and Their Limitations in Australia
Several Australian laws, including those concerning defamation, privacy, image-based abuse, passing off, and consumer protection, may apply to situations involving deepfake videos or audio clips. Filing a complaint with the eSafety commissioner is also an option. In theory, copyright law might offer some protection for a person’s image and voice, but its application is nuanced.
For example, the ABC needed just 90 seconds of Senator Lambie’s voice to create the AI clone. Under current law, Senator Lambie does not own the rights to the source material — in this case, the voice recording. Even if your image and voice are depicted, if you do not own the source material, you cannot sue for infringement.
Furthermore, Senator Lambie’s voice itself isn’t copyright-protected, because copyright generally applies only to tangible expressions. So while the ABC held copyright in the original 90-second sound recording, Senator Lambie — absent a specific agreement — would have no economic rights in that recording, nor in the subsequent clone.
The AI-generated clone itself is unlikely to be protected by Australian copyright laws because of the lack of identifiable human authorship. This also impacts moral rights, such as the right of attribution and the right of integrity, which may apply to the original audio clip, but not to a deepfake.
Personality Rights: A Potential Solution
In many US jurisdictions, personality rights are recognized, including the right of publicity, which protects an individual’s name, likeness, voice, and other attributes. Celebrities like Bette Midler and Johnny Carson have successfully used these rights to prevent unauthorized commercial use of their identities. However, personality rights may not always apply to AI voice clones, with some legal experts arguing that only recorded voices are protectable, not clones.
This has prompted states like Tennessee to introduce legislation specifically addressing AI-generated content. The Ensuring Likeness, Voice, and Image Security Act of 2024 directly addresses voice misappropriation through generative AI. This highlights a global need to give individuals greater legal recourse in these circumstances.
Urgent Need for Action
There is a longstanding scholarly debate about whether Australia should introduce statutory publicity rights, sometimes referred to as “personality rights”. Policymakers may be hesitant to introduce a new right because of perceived overlap with existing legal frameworks, which already provide partial protection. Any new law would therefore need to be well defined and straightforward to enforce.
Another challenge is enforcing these rights when deepfakes are created outside Australia. Australia could also seriously consider legislation similar to the US “No Fakes Bill”, which would allow people to protect their image and voice through intellectual property rights.
With deepfakes becoming increasingly common, including during elections, it is essential for Australians to remain vigilant. With a federal election approaching, we can only hope the incoming government will take steps to better protect people’s images and voices.