Social Media

People Are Livestreaming Suicide And Violence—And It Might Be Contagious

Here's what Facebook is doing, and not doing, to keep people from using its platform to broadcast violence and death.

When four black suspects allegedly beat a mentally disabled white man while shouting “f-ck Donald Trump” and “f-ck white people” in Chicago recently, they had an audience. The four suspects, who have been charged with a hate crime, according to The New York Times, purposefully streamed the entire assault in a 30-minute Facebook Live video. It was “broadcast for the entire world to see,” said Chicago Police Superintendent Eddie Johnson.

"I've been a cop for 28 years and I've seen things that you shouldn't see,” said Johnson, according to The Chicago Tribune. “It still amazes me how you still see things that you just shouldn’t.”

As long as there has been violence and self-harm, there have been people who have wanted to publicize their violent actions—but never before has it been so easy to show violence to so many people. With the proliferation of cell phones, social media and the ability to livestream from anywhere at any time, cops are no longer the only ones who regularly see “things that you just shouldn’t.”

“We’ve seen this in history, so that part’s not new,” said Dr. Jane Pearson, the National Institute of Mental Health’s lead expert on suicide prevention, in an interview with Oxygen. “What’s new is the medium.”

In the case of the Chicago assault, the video ran uninterrupted on Facebook for half an hour, during which time it went viral and was copied and distributed on outlets like YouTube and LiveLeak.

In the video—which I was able to easily find in its entirety—the 18-year-old girl filming the beating appears to respond to comments from offended viewers as the livestream is being broadcast.

“My little sister says it’s not funny,” she tells the rest of the group during the assault. Later: “What do you mean, ‘what the f-ck?’” And: “What are you talking about, ‘y’all going to jail?’”

It’s far from the first time that streaming services—particularly Facebook Live, but also Periscope and other livestreaming apps—have been used to broadcast violence. One teen livestreamed her friend’s rape on Periscope, at first apparently in an attempt to help her friend. But according to The New York Times, the prosecutor in the case said that once she was streaming, “she got caught up in the likes.” Another man livestreamed his getaway from police and the ensuing gun battle on Facebook, in a video that is still viewable on outlets like NBC News.

And a truly alarming number of people have livestreamed their own suicides, a subject tackled by The Miami Herald after a 14-year-old in foster care, Naika Venant (pictured, above), tragically hanged herself in a Facebook Live video that lasted two hours and was watched by hundreds of people. Before her suicide, Naika had texted: "I'm just tired of my life pointless I don't wanna do this anymore."

Viewers of suicides that have been livestreamed have mostly watched in horror, but some have shown disinterest, disbelief or outright encouragement. Naika had "legions of co-conspirators," wrote The Miami Herald: "The hundreds of Facebook 'friends' who watched the video unfold, laughed, mocked her and did nothing to intervene; one true friend did call police."

In one of the first livestreamed suicides, back in 2008, a 19-year-old filmed his suicide by pill overdose on the defunct site Justin.tv. An investigator, Wendy Crane, told ABC News: "People were egging him on and saying things like 'go ahead and do it, faggot.'" Many viewers apparently thought it was a hoax until, much later, they realized his body hadn't moved.

“Empathy Gone Awry”

Apart from the sheer cruelty of publicly airing a violent act, a big concern for researchers about broadcasting both assaults and suicide attempts is the risk of a contagion effect. It sounds facile, but it’s real: people who see someone commit suicide or violence against another person are more likely to do the same. 

Most studies of the social contagion effect following suicides and violence have focused on media coverage of such events—one typical study from 2015 found that each mass shooting incident “incites at least 0.30 new incidents” of similar crimes. Livestreamed violence hasn’t yet been studied as extensively, but experts believe it likely has a similar effect.

“Though we don't yet have direct, high-quality studies about livestreaming, parents should be aware that one highly publicized suicide or violent act can set off a chain of similar behaviors,” said Stephanie Hartselle, MD, with the American Academy of Child and Adolescent Psychiatry.

“Part of that is social norming,” Pearson said. “We give the example of social norming of drinking. You’re more likely to go out and drink if 90 percent of your friends drink, as opposed to 10 percent of your friends. It starts to build in some social position for this kind of behavior.”

The effect may be partially due to “empathy that has gone awry,” Pearson said.

Why livestream violence at all? It’s complicated, experts say. In the case of suicides, it may be a self-preserving instinct, Hartselle said.

“[M]any doctors believe this is a way for people to communicate a hope that someone will intervene and provide them with the help they need,” she said. “Similar to calling friends or family to say goodbye or posting a written message on social media, it can be a conscious or unconscious plea to be saved from their intense or desperate situation.”

Livestreaming violence directed at other people, however, usually has a different rationale, Hartselle added.

“Filming violent behavior may fall into a different category of people who are acting out anger, aggression or even psychotic thoughts in a callous or unemotional way,” she said.

“Reporting” Someone In Danger

Facebook takes a measured approach to violence shown through its livestreaming service, permitting it in some instances it deems newsworthy—for instance, it allowed Philando Castile’s girlfriend’s livestream of his fatal shooting by police to remain up last year, in a story that went national and brought increased awareness to issues of police brutality. The Chicago assault did not meet this standard.

“We do not allow people to celebrate or glorify crimes on Facebook and have removed the original video for this reason,” said William Nevius, a Facebook spokesman, about the Chicago attacks. “In many instances, though, when people share this type of content, they are doing so to condemn violence or raise awareness about it. In that case, the video would be allowed.”

But the video was left up for at least half an hour—possibly longer—before it was removed by Facebook. A spokeswoman for Facebook would not say how many people had reported the video before its removal.

Last year, Facebook incorporated suicide prevention tools into its platform, designed to provide support for people whose friends have reported their posts as showing potential warning signs of self-harm. It has also expanded its reporting tool—a small arrow in the upper right-hand corner of any post or video—to be available in every language the website supports. When a post is reported, it is passed on to a community operations staff of hundreds based around the world, some of whom have special training in suicide prevention. The team is on call 24 hours a day, Facebook said.

“[I]f someone does violate our Community Standards while using Live, we want to interrupt these streams as quickly as possible when they're reported to us, so we've given people a way to report violations during a live broadcast,” Nevius said. “We will also notify law enforcement if we see a threat that requires an immediate response, and suggest people contact emergency services themselves if they become aware of a situation where the authorities can help.”

But Facebook said a response to these reports may take as long as 24 hours, even though posts reported as potentially violent or as signals of self-harm are reviewed more quickly than posts reported for other rule violations, like nudity or spam. That’s potentially enough time to reach someone showing warning signs of suicide, but far too slow to reach someone in the process of harming themselves or someone else.

When you report a post on Facebook, you are asked to put it in one of three categories: “It’s annoying or not interesting,” “I think it shouldn’t be on Facebook,” or “It’s spam.” A livestreamed cry for help is difficult to place in any of those categories, though the second option will eventually take you to a page with resources to reach out to the person who’s struggling. 

Psychiatry experts Oxygen spoke to recommended that Facebook make its suicide prevention resources more visible on the site. Those resources are thoughtfully designed: people struggling with suicidal thoughts can speak to a suicide hotline through Facebook’s own chat service, an option Facebook found made people more likely to seek help. The experts also recommended that the social network use algorithms to flag words like “hurt” or “kill” that may indicate an intent to harm oneself or someone else.

But Facebook also takes the stance that some of the responsibility for preventing violence over livestream falls on the audience of these videos.

“[T]he first step,” Hartselle said, “is to call the local law enforcement department.”

If you or someone you know is considering suicide, reach out to the National Suicide Prevention Lifeline at 1-800-273-8255 (1-800-273-TALK) or text the Crisis Text Line at 741741. Experts recommend saving these numbers in your cell phone—along with the phone number of a trusted friend or relative and the non-emergency number for the police—in case of an emergency.

[Image: Naika Venant, Facebook]

