Facebook’s big push to moderate live video won’t succeed unless it does this

Image: Shutterstock/KieferPix

Mark Zuckerberg did the right thing when he announced Wednesday that Facebook plans to hire 3,000 people to review reports of objectionable content, including live video that might contain horrific scenes of murder and suicide.

The company may have wanted to confine Facebook Live to viral quirkiness like Chewbacca Mom and a folk artist covering Tears for Fears on the hammered dulcimer, or to serious streams of election results and people recording and broadcasting injustice. The sad reality, however, is that people have used the technology to share grisly footage of little public value and high potential for collective trauma. Last week, for example, a man in Thailand broadcast himself killing his infant daughter and then himself.

While Zuckerberg hopes to protect the Facebook community from harm, his announcement raises big questions about how the company plans to shield its new hires from the emotional and mental toll of sorting through videos that might make most of us recoil or weep.

This is an intense, high-stakes challenge, and getting it right means putting numerous policies and practices in place, covering everything from hiring strategies to mental health benefits to stigma reduction to workplace culture, says David Ballard, an expert in creating psychologically healthy workplaces and assistant executive director for Organizational Excellence at the American Psychological Association.

“This is new territory,” Ballard says of trying to safeguard the mental health of people paid to monitor live video for violence. “It’s basically vicarious traumatization. They’re not in the life-threatening situation themselves, but they’re viewing a situation that’s overwhelming, extreme, and upsetting.”

Facebook isn’t saying much about its policies so far. A spokesperson for the company wouldn’t comment except to say that everyone reviewing content on the platform is offered psychological support and wellness resources. There is also a program specifically designed to help employees who review potentially traumatic content, and it is evaluated annually.

The company also hasn’t said whether the new hires will be contractors or employees. The difference between the two experiences can be significant; contractors may be seen as replaceable and treated as such, whereas employees enjoy perks, benefits, and a more supportive workplace culture.

“Finding a good fit for individuals who can function in that environment, exposed to that type of content, is really going to be important.”

What we do know from previous reporting is that people who moderate online content, including abusive language and still images, are often poorly paid and experience negative psychological outcomes. But much of the research on what it’s like to experience vicarious trauma focuses on first responders who put out fires and triage medical emergencies. Those studies don’t offer an easy parallel to the situation Facebook now faces, where its reviewers will identify and remove live videos of people hurting themselves or others.

Ballard says the job description for this role must be abundantly clear about expectations so that it’s the right fit for both Facebook and the prospective employee. Emergency responders, for example, regularly undergo a fitness-for-duty evaluation, which determines whether an applicant is physically and psychologically capable of doing the job. Ballard recommends that Facebook, if it hasn’t already, work with an expert who knows how to identify candidates for high-stress or emotionally taxing jobs.

“Finding a good fit for individuals who can function in that environment, exposed to that type of content, is really going to be important,” says Ballard.

It would probably be a mistake, he adds, to hire entry-level applicants who have little professional experience. Likewise, when new employees start the job, they should receive comprehensive training on what responses to trauma look like and how to deal with them, including by using company-provided resources.

Managers need similar training as well, with a focus on how to talk to employees about trauma, identify signs of emotional distress or mental illness, and refer people to support.

While Facebook offers psychological support and wellness resources, Ballard specifically recommends easy access to an employee assistance program, mental health benefits that are on par with physical health benefits, workplace education around mental health issues, and stress management programs.

Companies should aim to have these in place regardless of whether their employees are trying to keep live video of suicide or murder from reaching millions of people, but Ballard says such measures are particularly important given the work that Facebook reviewers will perform: “In an environment like this, it will be even more critical.”

“Even if they nail it now and get all the pieces in place, it’s not going to remain static.”

This is hard enough to execute in a company with thousands of people, and it gets even trickier, says Ballard, if any of the new reviewers are contractors working remotely or in other countries. That introduces complex considerations about how to gauge someone’s well-being and how to address cultural differences around mental health. It also raises serious questions about whether a contractor can create a workplace environment that mirrors what Facebook employees experience.

Finally, Ballard says, Facebook will have to continually evaluate how its employees are coping with the unique pressures of this particular job and whether its support is making a positive difference. It sounds like the company is already doing that; its latest move brings that effort to a much larger scale.

Basically, in committing itself to protecting users from traumatic live video, Facebook has signed up for one of the more difficult tasks a social media company could take on. It’s necessary and essential, but, as you might have guessed, there’s at least one more obstacle.

“The nature of the content will evolve over time,” says Ballard. “Even if they nail it now and get all the pieces in place, it’s not going to remain static.”

So even if Facebook won’t comment in depth on its current efforts, expect these discussions to continue for a long time to come.

Read more here: http://mashable.com/