Legal reform is needed to protect young women from the growing threats of online sexual violence
by Anthony Fong, Global Journalism Fellow, Dalla Lana School of Public Health, University of Toronto
The increase in online interaction created by COVID-19 has generated a spike in girls and young women being subjected to what’s called technology-facilitated sexual violence (TFSV). The term refers to everything from sharing someone’s nude photos without their consent to sending unsolicited pictures of one’s own genitals.
TFSV affects 88 per cent of all Canadian university undergraduate women, and younger teens are not spared.
Survivors have few legal options and, recent research has found, face a higher risk of suicide. This highlights the need for more education and legal reform around these acts, which some legal experts say should be criminal.
“No matter how much you think you’re protecting your child, they can still get to them,” says Heather Mackie of Vancouver, B.C., whose name has been changed to protect the identity of her daughter.
Two years ago, Mackie’s then 12-year-old daughter, Emma (not her real name), created an Instagram account for her fictional character on Roblox, a popular online gaming platform whose users are mostly under 16 years old. What Emma next received in her inbox shocked her, and her mother.
“It was a picture of a man’s genitals,” says Mackie. Emma was visibly upset. “She deleted it and blocked him. We then deleted the account.”
Experiences like Emma’s are common. A recent Canadian survey of university-aged women found 6.4 per cent had their first experience with online sexual harassment between 12 and 14 years of age.
Experts differ slightly in how they classify forms of TFSV. One classification includes image-based sexual abuse (the non-consensual sharing of victims' images), video voyeurism and unsolicited sexual images, which is what Emma received.
Another definition adds online sexual aggression and coercion, including extortion, blackmail and bribery, as well as online harassment of people based on their gender or sexuality.
Terminology is important. According to Rosel Kim, staff lawyer at the Women’s Legal Education and Action Fund in Toronto, terms such as “cyberviolence” downplay the severity of the act. “Cyberviolence is not separate from violence,” she says.
Another term, "revenge porn," blames victims; the act is better described as a form of image-based sexual abuse.
Which brings us back to Emma, who a year later had a second incident. She was on the now-defunct social networking app Houseparty and witnessed a friend being harassed online.
“The language they used was shocking,” says Mackie, whose daughter took screenshots of the chat and reported the incident to the police liaison officer at her school. The bully had sent an image depicting anal penetration of a popular children’s cartoon character, and the rest was “mostly words telling her to go kill herself.”
Words that, as it turns out, can lead to real harm.
Online violence and suicide
“Sexual violence has been around forever, but the context has shifted (online),” says Amanda Champion, a criminology PhD candidate at Simon Fraser University in Burnaby, B.C.
Champion is the co-author of a 2021 study that clarified the psychological link between TFSV and suicide. According to her findings, TFSV victims’ public exposure makes them targets for bullying, which can lead to depression and the feeling that they’re a burden to friends and family.
This “perceived burdensomeness” leads victims to “believe that you’re so much of a burden that your death is worth more than your life,” which opens the door to suicide, Champion says.
In Canada, this process was starkly illustrated in 2012, when 15-year-old Amanda Todd died by suicide after a nude screenshot of her was shared online without her consent. A year later, 17-year-old Rehtaeh Parsons, who was allegedly raped and then bullied over shared photos of the assault, also ended her own life.
In light of these stories, lawyers have been pondering how to hold perpetrators accountable for TFSV while protecting survivors.
In Canada, not a lot of people realize they can report TFSV to the police, says Suzie Dunn, a law and technology professor at Dalhousie University in Halifax. “It’s downplayed by society and even by police. People are still conceptualizing whether or not these are true harms,” she says.
When it comes to legal options, Kim and Dunn say the key is understanding what TFSV victims’ goals are. “Maybe they want images to be taken down, or an apology — not necessarily to put a person in jail,” says Kim.
Under the Canadian Criminal Code, an offender may be charged with voyeurism, obscene publication, criminal harassment, extortion or defamatory libel. However, if the only goal is to have harmful content taken down, then pursuing a criminal charge may be more trouble than it’s worth, Kim says.
The first barrier is convincing the police that there’s enough evidence to charge an offender. Then once in court, “you have to prove beyond a reasonable doubt” — a high burden of proof, says Kim. During trial, the accused’s lawyer may also expose a survivor to further trauma.
Lastly, the criminal justice system moves slowly — and without a conviction, harmful content stays up, says Dunn.
The need for legal reform
Dunn says Canada lags behind other nations such as Australia when it comes to education, research and legislation around TFSV.
Since 2015, Australia has had an eSafety Commissioner, “the world’s first government agency committed to keeping its citizens safer online.” Kim and Dunn say Canada should have a similar government-funded statutory body that advocates in this area.
Dunn and Kim agree that starting points for advocacy may include implementing more expedient image take-down laws and regulating social media companies such as Facebook.
“These platforms make money through engagement. What’s engaging content is often extreme content that tends to be abusive or violent,” says Kim.
In last fall’s federal election, the Liberals promised to rework online harms legislation within 100 days of Parliament’s Nov. 22 return — that timer is set to expire on March 2.