TikTok, the popular social media app owned by Chinese tech company ByteDance, has been under a national security investigation by U.S. lawmakers who have raised concerns about the company’s access to U.S. user data and whether it censors content at the behest of the Chinese government. Today, TikTok is trying to combat these concerns by opening a “Transparency Center” that will allow outside experts to examine and verify its practices.
The new facility in TikTok’s L.A. office will let outside experts see how TikTok’s teams operate day-to-day as staff moderate content on the platform, the company explains. This includes how moderators apply TikTok’s Community Guidelines when reviewing content its technology has automatically flagged, as well as other content the technology may have missed.
In addition, the experts will be shown how users and creators can raise concerns with TikTok and how those concerns are handled. TikTok will also explain how it determines whether content on the platform aligns with its Community Guidelines, the company says.
The center mainly aims to address U.S. concerns about censorship, since TikTok, as a Chinese-owned company, may be required under local laws to comply with “state intelligence work,” experts have said. TikTok has long denied that’s the case, claiming that no governments — foreign or domestic — have directed its content moderation practices.
That being said, The Washington Post reported last year that searches on TikTok returned far fewer videos of the Hong Kong protests than expected, prompting suspicions that censorship was taking place. The Guardian, meanwhile, obtained a set of TikTok content guidelines that appeared to advance Chinese foreign policy through the app. TikTok said those guidelines were outdated and no longer in use.
TikTok’s moderation practices are still being questioned today, however. In November, for example, it removed a video that criticized China’s treatment of Muslims. The video was restored after press coverage, with TikTok citing a “human moderation error” for the removal.
While the larger concern for U.S. lawmakers is the potential for Chinese influence through social media, TikTok at times makes other moderation choices that don’t appear to be in line with U.S. values. For example, singer Lizzo recently shaded TikTok for removing videos of her wearing a bathing suit, even as TikTok stars posted videos of themselves dancing in their bathing suits. (The deleted videos were later restored after press coverage.) The BBC also reported that transgender users were having their posts or sounds removed by TikTok, and the company couldn’t properly explain why. And The Guardian reported on bans of pro-LGBT content; again, TikTok said the guidelines referenced in the article were no longer in use.
TikTok says the new Transparency Center will not only allow experts to observe its moderation practices, but also to provide input on them.
“We expect the Transparency Center to operate as a forum where observers will be able to provide meaningful feedback on our practices. Our landscape and industry is rapidly evolving, and we are aware that our systems, policies and practices are not flawless, which is why we are committed to constant improvement,” said TikTok U.S. General Manager Vanessa Pappas. “We look forward to hosting experts from around the world and continuing to find innovative ways to improve our content moderation and data security systems,” she added.
The center will open in early May with an initial focus on moderation. Later, TikTok says, it will also offer insight into the app’s source code and its efforts around data privacy and security. That second phase will be led by TikTok’s newly appointed Chief Information Security Officer, Roland Cloutier, who starts next month.
The company notes it has taken a number of steps to ensure its business can continue to operate in the U.S. These include the release of its updated Community Guidelines and the publication of its first Transparency Report a few months ago. TikTok has also hired a global General Counsel and expanded its Trust & Safety hubs in the U.S., Ireland and Singapore, it said.