Facebook’s ‘Liam Bot’ Helps Employees Navigate Awkward Holiday Conversations

(via Alex Haney/Unsplash)

There’s nothing worse than fielding awkward questions from family and friends over the holidays—especially if you work at Facebook.

The social media giant has faced such a steady stream of controversies—privacy, censorship, fake news, livestreaming violent incidents, political manipulation—that it’s hard to keep up.

So, just before Thanksgiving, Mark Zuckerberg & Co. rolled out an employee chatbot to help answer tough questions.

If a relative asks, for instance, how Facebook handles hate speech, the chatbot instructs the employee to answer with the following points (a rough sketch of the pattern follows the list):

  • Facebook consults with experts on the matter
  • It has hired more moderators to police its content
  • It is working on AI to spot hate speech
  • Regulation is important for addressing the issue
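The mechanics, as described, are simple enough to sketch. Below is a minimal, hypothetical Python example of that question-to-talking-points pattern; the `TALKING_POINTS` table, the keywords, and the fallback message are all illustrative assumptions, since Facebook has not published how the Liam Bot actually works.

```python
# Hypothetical sketch of a canned-response "talking points" bot.
# Everything here is illustrative; this is not Facebook's code.

TALKING_POINTS = {
    "hate speech": [
        "Facebook consults with experts on the matter.",
        "It has hired more moderators to police its content.",
        "It is working on AI to spot hate speech.",
        "Regulation is important for addressing the issue.",
    ],
    "privacy": [
        "See the company's blog posts and news releases on the topic.",
    ],
}

def answer(question: str) -> str:
    """Return the approved talking points whose keyword appears in the question."""
    q = question.lower()
    for keyword, points in TALKING_POINTS.items():
        if keyword in q:
            return "\n".join(f"- {p}" for p in points)
    # Fallback when no keyword matches (also an assumption).
    return "No approved talking points found; see the company newsroom."

if __name__ == "__main__":
    print(answer("How does Facebook handle hate speech?"))
```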

It would also, according to The New York Times, suggest citing stats from an official report about how the company enforces its standards.

Dubbed the “Liam Bot,” the software merely parrots what company executives have already said publicly. The name’s origin is unclear; it may refer to the employee who first raised concerns about potentially uncomfortable conversations.

“Our employees regularly ask for information to use with friends and family on topics that have been in the news, especially around the holidays,” a Facebook spokesperson told the Times. “We put this into a chatbot, which we began testing this spring.”

The company has previously shared news releases in internal groups or directly with employees asking for advice.

This year, though, it introduced the Liam Bot, packed with public-relations-approved statements that will hopefully (but not likely) appease inquiring minds.

In addition to vague assertions about its practices, the bot links to company blog posts and news releases. It is also “practical with personal technology advice,” the Times said.
