
Facebook moderation must be proactive, sensitive

Published: Wednesday, May 17, 2017
Mental Health Matters

“Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”

These were the words of Facebook (FB) founder Mark Zuckerberg on his FB page earlier this month in an effort to promote the world’s leading social media forum as a “safe community.”

“No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need,” wrote Zuckerberg.

Facebook is regarded as the largest social media forum globally, and its CEO works constantly on proactive and sensitive measures to keep it a safe community. As the forum evolves with its many benefits, Zuckerberg, understanding that a single negative event mars the brand, keeps trying to enhance resources, both financial and human, so that Facebook appears to cover the bases of “do no harm”.

With the World Health Organization (WHO) declaring depression the leading cause of disability worldwide, and with T&T ranked 41st globally and third in the Americas for deaths by suicide, there must be similar concern here for these issues.

In a post on May 3, Zuckerberg promised to add 3,000 people to Facebook’s community operations team around the world. This was in addition to the 4,500 that FB already employs to, in his words, “review the millions of reports we get every week, and improve the process for doing it quickly.”

It would be interesting to know how much of Zuckerberg’s monitoring and algorithms is geared towards small populations like T&T’s. In other words, are we covered in that outreach, especially when T&T itself does not seem to have made any investment in appropriate social media use?

Last week’s column dealt with suicidal ideation and the responsibility of Facebook administrators to ensure that those presenting with such dilemmas are met with the appropriate response. It was prompted by recent news incidents elsewhere in which Facebook Live was used in both a murder and a death by suicide, incidents that remain chilling to me.

Until it has happened in a particular jurisdiction, the idea of cyber suicide or homicide may seem far-fetched. Yet there are concerns among the groups with which I interact, even those that benefit from Zuckerberg’s moderators and algorithms (and some of these interventions have not filtered into some countries).

A few organisations have publicly set out their own measures to assure members that their forums are being appropriately managed. That is desirable.

Earlier this month, Zuckerberg wrote, “Over the last few weeks, we’ve seen people hurting themselves and others on Facebook—either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.

“If we’re going to build a safe community,” he said, “we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down.”

According to Zuckerberg, reviewers will help FB get better at removing things like hate speech and child exploitation. FB also works with “local community groups and law enforcement who are in the best position to help someone if they need it—either because they’re about to harm themselves, or because they’re in danger from someone else.”

 

Intervention/education

 

Rethink Mental Illness (rethink.org), a UK charity, is one group that remains proactive and sensitive in promoting the best use of the Facebook forum. It has set out public guidelines and given working examples of the many ways to support people, especially the suicidal.

“Everything you write, from a proposal to a policy document,” wrote Rethink, “to putting a sign on the fridge, makes a better life possible for someone with mental illness. How does that change how you write it? Can you get it across?”

Many Facebook groups, pages, moderators and users think it is okay to pacify those experiencing suicidal ideation with suggestions of prayer, religious overtones, clichés and pep talks. That is not the recommended response.

As in last week’s column, here is a rethink.org example of how to respond if someone posts that they are in crisis:

“When responding to this type of post on Facebook, use a three-step method:

• Empathise

• Remind that there is support available

• Signpost

Example:

“Please help me, I don’t think I can go on anymore.”

Response:

“Hi (name), it sounds like things are so tough at the moment. Please know that there is always support available. If you’re having thoughts of hurting yourself it’s important to let your doctor or crisis team know how you are feeling. You can also contact [named organisation] any time of the day or night on [phone number]. If you don’t want to use the phone you can text, email or visit them locally. For details, access their website here [webpage given].”

 

• Caroline C Ravello is a strategic communications and media practitioner with over 30 years of experience. She holds an MA in Mass Communications and is a candidate for the MSc in Public Health (MPH) at The UWI. Write to: mindful.tt@gmail.com.