In today’s digital age, seeing is no longer believing. Deepfake technologies are shaking up the legal landscape as Canada grapples with how to catch up to these AI imposters. Deepfakes are hyper-realistic media manipulations created using artificial intelligence (AI), in which images, audio, or video are either digitally altered or fully generated by AI to convincingly replace one person’s likeness with another.[i] These sophisticated synthetic media manipulations are blurring the line between reality and fiction.
DEEPFAKES IN THE WILD
Not all uses of deepfake technology are nefarious or deceptive. Deepfake technology can be, and is, used for entertainment purposes, such as augmenting video game characters or developing satirical content.[ii] However, the use of deepfakes sparks major concerns when the images, voices and even movements of real people are manipulated to create content that makes it appear as though the people portrayed are saying or doing things they never said or did.
A 2023 report analyzing the state of deepfakes found that non-consensual pornographic clips constitute 98% of all deepfake videos found online.[iii] In late January 2024, sexually explicit deepfakes of singer Taylor Swift went viral on X (formerly known as Twitter). To prevent further distribution, X made Swift’s name unsearchable for 48 hours. Despite this effort, the images were still viewed millions of times before being taken down, with one image viewed more than 45 million times.[iv]
Unfortunately, Swift is just one of many women targeted by pornographic deepfake videos, a crime in which 99% of the individuals targeted are women.[v] Most do not have Taylor Swift’s resources to force the removal of these images from the internet. The creation and dissemination of non-consensual deepfake pornography not only infringes on a person’s right to control their own image and identity, but can also cause irreversible harm to their reputation and mental health.[vi]
The use of deepfake videos to commit fraud is reaching new heights. In February of this year, an employee of a multinational finance company was tricked into handing over $25 million to criminals who used deepfake technology to stage a video conference call populated with what appeared to be the employee’s colleagues, including the chief financial officer. In the meeting, the employee was instructed to make a transaction and, dutifully following what he believed were his superiors’ instructions, he transferred the $25 million to the criminals.[vii]
It’s not just corporations being targeted. Last year, a family living in Newfoundland and Labrador received a call from a voice they recognized as their son’s, claiming to be in trouble and in urgent need of their help. The couple handed over nearly $10,000 in cash to bail him out. Meanwhile, their son was at his own home and knew nothing of the so-called emergency.[viii] These incidents demonstrate the growing sophistication of deepfake-enabled scams, which can result in significant financial losses for their victims.
In the last two years, deepfakes of politicians have become increasingly common. In March of this year, a deepfake advertisement depicting Justin Trudeau recommending a cryptocurrency exchange was posted on YouTube. In Slovakia, days before the country’s parliamentary elections in September 2023, a fake audio recording of one of the candidates boasting about how he had rigged the election surfaced online.[ix]
In an era where videos go viral globally within seconds on platforms like Instagram, TikTok, and X, reaching audiences who assume that what they see is genuine, deepfakes can easily spread misinformation and shape public opinion.[x] The potential use of deepfakes in upcoming elections raises serious concerns that require swift government action.
THE CANADIAN GOVERNMENT’S RESPONSE
The Canadian Government has taken steps to regulate the nefarious uses of deepfake technologies. The most recent legislative developments include:
Bill C-63: The Online Harms Act
On February 26 of this year, the Government of Canada introduced Bill C-63, which would enact the Online Harms Act (the Bill).[xi] This is the first piece of federal legislation to explicitly address deepfakes. The Bill aims to hold social media services accountable for “harmful content” hosted on their platforms and to create stronger online protections for everyone in Canada. Among other things, “harmful content” includes “intimate content communicated without consent”, and explicitly includes content of this nature created by deepfakes.[xii] Recognition of the need to regulate the distribution of non-consensual intimate images is long overdue. While several provinces, including Alberta, Nova Scotia, Manitoba, British Columbia, Saskatchewan, New Brunswick, Newfoundland and Labrador, and Prince Edward Island, have passed laws that deal with the distribution of non-consensual intimate images, many of those laws do not provide any specific recourse for deepfakes.[xiii]
Although a step in the right direction, the Bill does not address harmful deepfake content beyond the sexual content described above. As the Bill moves through the legislative process, it is subject to change, and the scope of harmful content will hopefully be expanded to capture other forms of deepfakes, such as those depicting individuals expressing opinions that could damage their reputation or credibility.
Bill C-65: Amendments to the Canada Elections Act
On March 20, 2024, the Canadian government tabled Bill C-65, which seeks to amend the Canada Elections Act (the Elections Act) and provides a comprehensive set of measures meant to safeguard election integrity and enhance trust in Canada’s electoral process.[xiv] Although not yet explicitly set out in the first version of the Bill, there have been reports that section 480.1 of the Elections Act, which addresses the impersonation of certain people involved in the election process, will be expanded to cover deepfakes.[xv] In the meantime, Elections Canada has launched an online tool called “Electofacts” that Canadians can use to verify whether information they encounter about the federal electoral process is accurate.[xvi]
While Bill C-65 and the Electofacts tool mark a significant step forward in curbing the spread of misinformation, Canada trails behind the United States in addressing this issue. Since January of last year, 41 states have introduced legislation to regulate election-related deepfakes, with 11 states having already enacted such laws.[xvii] This underscores a pressing need for Canada to expedite its legislative response, ideally before the next federal election is called.
CONCLUSION
As AI advancements make deepfakes more convincing, our ability to distinguish fact from fiction drastically diminishes.[xviii] It’s time to accelerate our legal response so that the available recourse keeps pace with the swiftly evolving deepfakes reshaping our digital reality.
Article By: Jennifer Davidson and Victoria Di Felice
This article originally appeared on the OBA Information Technology & Intellectual Property Law Section’s articles page.
[i] Government of Canada, “Deepfakes: A Real Threat to a Canadian Future” (16 Nov 2023), online: Government of Canada <https://www.canada.ca/en/security-intelligence-service/corporate/publications/the-evolution-of-disinformation-a-deepfake-future/deepfakes-a-real-threat-to-a-canadian-future.html>.
[ii] Ibid; WIPO, “Artificial intelligence: deepfakes in the entertainment industry” (June 2022), online: WIPO <https://www.wipo.int/wipo_magazine/en/2022/02/article_0003.html>.
[iii] Home Security Heroes, “2023 State of Deepfakes” (2023), online: Home Security Heroes <https://www.homesecurityheroes.com/state-of-deepfakes/>.
[iv] Eva Zhu, “Will the Taylor Swift AI deepfakes finally make governments take action?” (31 Jan 2024), online: CBC <https://www.cbc.ca/arts/commotion/will-the-taylor-swift-ai-deepfakes-finally-make-governments-take-action-1.7100874>.
[v] Supra note 3.
[vi] Halle Nelson, “Taylor Swift and the Dangers of Deepfake Pornography” (8 Feb 2024), online: National Sexual Violence Resource Center <https://www.nsvrc.org/blogs/feminism/taylor-swift-and-dangers-deepfake-pornography#:~:text=According%20to%20an%20AI%2Ddeveloped,altered%20images%20of%20underage%20girls>.
[vii] Heather Chen & Kathleen Magramo, “Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’” (4 Feb 2024), online: CNN <https://www.cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk/index.html>.
[viii] Mark Quinn, “N.L. family warns others not to fall victim to the same deepfake phone scam that costs them $10K” (29 Mar 2023), online: CBC <https://www.cbc.ca/news/canada/newfoundland-labrador/deepfake-phone-scame-1.6793296>.
[ix] Curt Devine, Donie O’Sullivan & Sean Lyngaas, “A fake recording of a candidate saying he’d rigged the election went viral. Experts say it’s only the beginning” (1 Feb 2024), online: CNN <https://www.cnn.com/2024/02/01/politics/election-deepfake-threats-invs/index.html>.
[x] Supra note 1; Government of Canada, “Disinformation, Deepfakes, and the Human Response” (16 Nov 2023), online: Government of Canada <https://www.canada.ca/en/security-intelligence-service/corporate/publications/the-evolution-of-disinformation-a-deepfake-future/disinformation-deepfakes-and-the-human-response.html>.
[xi] Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, 1st session, 44th Parliament, 2024 (first reading 26 February 2024).
[xii] Ibid, Online Harms Act, s 2(1): See definition of “intimate content communicated without consent”, subsection (b).
[xiii] Intimate Images Protection Act, SBC 2023, c 11 (BC); Protecting Victims of Non-consensual Distribution of Intimate Images Act, SA 2017, c P-26.9 (AB); Intimate Images and Cyber-protection Act, SNS 2017, c 7 (NS); The Intimate Image Protection Act, CCSM c I87 (MB); The Privacy (Intimate Images – Additional Remedies) Amendment Act, 2022, SS 2022, c 29 (SK); Intimate Images Unlawful Distribution Act, SNB 2022, c 1 (NB); Intimate Images Protection Act, RSPEI 1988, c I-9.1 (PEI); Intimate Images Protection Act, SNL 2018, c I-22 (NL).
[xiv] Bill C-65, An Act to amend the Canada Elections Act, 1st session, 44th Parliament, 2024 (first reading 20 March 2024), “Summary” preamble.
[xv] Darren Major, “Liberals introduce legislation amending Elections Act as part of agreement with NDP” (20 Mar 2024), online: CBC <https://www.cbc.ca/news/politics/elections-act-update-legislation-1.7149657>; Rachel Aiello, “Liberals table elections law reforms aimed at making it easier to vote, harder to meddle” (20 Mar 2024), online: CTV News <https://www.ctvnews.ca/politics/liberals-table-electoral-reform-legislation-that-could-change-the-way-voters-cast-their-ballots-1.6814769>.
[xvi] Elections Canada, “New resource to counter misinformation and disinformation about the electoral process” (9 Jan 2024), online: Elections Canada <https://www.elections.ca/content.aspx?section=med&dir=pre&document=jan0924&lang=e>.
[xvii] Public Citizen, “Tracker: State Legislation on Deepfakes in Elections” (last updated 5 April 2024), online: Public Citizen <https://www.citizen.org/article/tracker-legislation-on-deepfakes-in-elections/>.
[xviii] Patrick Tucker, “Deepfakes Are Getting Better, Easier to Make, and Cheaper” (6 August 2020), online: Defense One <https://www.defenseone.com/technology/2020/08/deepfakes-are-getting-better-easier-make-and-cheaper/167536/>.
Disclaimer: This Newsletter is intended to provide readers with general information on legal developments in the areas of e-commerce, information technology and intellectual property. It is not intended to be a complete statement of the law, nor is it intended to provide legal advice. No person should act or rely upon the information contained in this newsletter without seeking legal advice.
E-TIPS is a registered trade-mark of Deeth Williams Wall LLP.