Naomi Sayers |
I was nervous. I prepared and got my testimony to between seven and 10 minutes, if I recall. I flew in and I flew out. It was fast. I was on the first day, after the then-justice minister Peter MacKay. When I testified, I struggled to get the words out because I remembered all the lives that were lost to get to that point. When I was done, I was swept into media and thrust into live tapings and interviews. Then it was over, and soon after, I got a notification on my phone.
I had nudes connected to my identity on social media. I was sick to my stomach. I called the police then, and they said, “Nothing we can do.”
When I read the recent news article about the group of teenage girls who were victims of a boy who created “deep fake” porn using their identities and social media, I knew all too well what they were going through. For the Toronto police to tell these girls, using some sort of PowerPoint, that there is nothing they can do is plainly wrong.
The publicly available facts are, generally, that a boy created pornographic images using the girls’ photos, which he pulled from social media and, for one girl, from a photo that was sent to his Snapchat. The discovery happened at a coed slumber party where the phone in question was reviewed by others.
The news of the images got out, and a number of young people either saw the photos or knew of them; it is not clear which. The young people who discovered the photos on the phone recorded video evidence. The officer who visited the boy’s home allegedly said the phone was wiped clean, whatever that means.
There are tools available that I often see the police use against my clients when a matter involves technology and social media: everything from phone extractions to uncovering supposedly deleted data sitting on the back end. Police also use search warrants for other devices and orders demanding production from third-party applications.
It seems that the police are hanging their hat on the unknown: whether the boy “communicated” the images to others. As it presently stands, there is no publicly available indication that the police have examined the phone, any other devices connected to it, or the applications used on it. Examining the phone would be the bare minimum required in this instance, and if it was done, it should have been communicated to the victims. Instead, the police are trusting the good word of the boy and others in question. That is not the standard of a police investigation: an investigation requires police to explore all avenues for evidence and to ensure all evidence is preserved, which includes seizing the phone, examining it and seeking production orders from the third-party applications.
As a reminder, laying charges against a person or persons requires meeting a higher standard: generally, to put it in layperson’s terms, all elements of an offence must be met or capable of being met. I have seen a lot of the discussion surrounding child pornography, and that is one option, but there are also others.
For example, there is public incitement of hatred. This offence requires an identifiable group, which in this case includes young girls and young women, and requires that statements inciting hatred be communicated in a public place or otherwise than in private conversation.
Hatred is defined in R. v. Keegstra, [1990] S.C.J. No. 131:
Hatred is predicated on destruction, and hatred against identifiable groups therefore thrives on insensitivity, bigotry and destruction of both the target group and of the values of our society. Hatred in this sense is a most extreme emotion that belies reason; an emotion that, if exercised against members of an identifiable group, implies that those individuals are to be despised, scorned, denied respect and made subject to ill-treatment on the basis of group affiliation.
It appears that the deep fake porn furthers a dominant and patriarchal narrative that women and girls have very little control over their bodies and their physical and psychological integrity. Even when their psychological integrity is impacted, there is little the police will do (other than make a PowerPoint). Having the police build a PowerPoint to explain why they can’t do anything doesn’t make the decision the right and just one. It is simply a distraction from the police handling of the matter.
There is plenty of literature and jurisprudence recognizing that sexual violence against women and girls is omnipresent in Canadian society, and the Supreme Court of Canada has repeatedly recognized that Canadian society is committed to protecting the personal integrity, both physical and psychological, of every individual. Canada’s highest court has also held that “the common law has recognized for centuries that the individual’s right to physical integrity is a fundamental principle, ‘every man’s person being sacred, and no other having a right to meddle with it, in any the slightest manner,’” and physical integrity includes psychological integrity (R. v. Ewanchuk, [1999] 1 S.C.R. 330).
To say that the deep fake porn isn’t an effort to incite hatred against a specific class of people is to ignore how this conduct specifically targets women and girls, particularly underage girls, who are especially vulnerable. Young people have a right to exist online and to express themselves as they see fit, within limits, and there is no suggestion whatsoever that they were engaging in any offensive or criminal conduct; their images were taken and used for a nefarious purpose. Their online identity is an extension of who they are in Canadian society, and this is a right to be protected.
There is also mischief in relation to computer data; this conduct could fall under the provision relating to obstructing, interrupting or interfering with the lawful use of computer data. Computer data is defined in another section of the Criminal Code to mean representations, including signs, signals or symbols, that are in a form suitable for processing in a computer system, which could include a phone with an operating system and programs capable of performing a variety of functions, like social media applications. Women and girls do not post their photos online to be used unlawfully; the fact that they post online doesn’t make it okay for others to do as they please. Any reviewing court could reasonably take judicial notice of the impacts of having someone’s images used to create fake porn.
It appears, based on what is publicly available, that the boy created the deep fake porn using social media images and that the phone was then apparently wiped. Based on my experience with police examining phones and working with applications in investigations, there are specific programs used to extract data from even “wiped” phones, and orders can be sought from third-party applications. There is no suggestion by the police that any such orders were sought to examine the applications, nor that any phone was preserved for examination. It is not even clear how, or with what applications, the deep fake porn was created.
As a reminder, this isn’t about whether these images were stolen but rather about the unlawful use of these images for another purpose without consent. It is very evident that the girls did not consent, and to use their images to create deep fake porn is quintessentially unlawful behaviour.
Finally, there is also criminal harassment, where creating the deep fake porn amounts to knowingly engaging in threatening conduct that is communicated to others. The threatening conduct is the creation and communication of the deep fake porn, which sends the signal to the young girls that they are being watched online and that their photos can be used for whatever purpose another person desires, including an unlawful one. And the police will simply say, “Nothing we can do.” The girls felt scared and helpless, and that is exactly the type of response this form of communication, the deep fake porn, was intended to create. It is hard to conceive how a force like the Toronto police is unable to find any offence in the Criminal Code, a piece of legislation they are expected to enforce and uphold.
This case is an urgent call for police to understand how violence is inflicted through technological means; we don’t need PowerPoints created by police.
In the end, I do not agree with the calls for amendments to our legislation. I think there is enough in the Criminal Code to support laying a number of charges. I think there needs to be more education about how technology is used to inflict violence against vulnerable groups, like underage girls.
Naomi Sayers is an Indigenous lawyer from the Garden River First Nation with her own public law practice. She sometimes teaches, primarily on Indigenous rights and governance issues. She tweets under the moniker @kwetoday.
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author's firm, its clients, Law360 Canada, LexisNexis Canada, or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.
Interested in writing for us? To learn more about how you can add your voice to Law360 Canada contact Analysis Editor Peter Carter at peter.carter@lexisnexis.ca or call 647-776-6740.