For obvious reasons, Trump doesn’t have a TikTok account. But the President’s speeches that helped incite the mob that stormed the U.S. Capitol yesterday will have no home on TikTok’s platform, either. The company confirmed to TechCrunch that its content policy around the Capitol riots includes removing videos of Trump’s speeches to supporters. It will also redirect specific hashtags used by rioters, like #stormthecapitol and #patriotparty, to reduce that content’s visibility in the app.
TikTok says that Trump’s speeches, in which the President reiterated claims of a fraudulent election, are being removed on the grounds that they violate the company’s misinformation policy. That policy defines misinformation as content that is inaccurate or false. And it explains that while TikTok encourages people to have respectful conversations on subjects that matter to them, it doesn’t permit misinformation that can cause harm to individuals, their community or the larger public.
A rioting mob intent on stopping democratic processes in the United States seems to fit squarely under that policy.
However, TikTok says it will allow what it calls “counter speech” against the Trump videos. This is a form of speech often used to fight misinformation, where the creator presents factual information or disputes the claims made in another video. In November, TikTok allowed counter speech in response to claims from Trump supporters that the election was “rigged,” even as it blocked the top hashtags used to promote those ideas.
In the case of Trump’s speeches, TikTok will allow a user to, for example, use the green screen effect to comment on the speech — unless those comments support the riots.
In addition, TikTok is allowing some videos of the violence that took place at the Capitol to remain. For example, if the video condemns the violence or originates from a news organization, it may be allowed. TikTok is also applying its recently launched opt-in viewing screens on “newsworthy” content that may depict graphic violence.
These screens, announced in December, appear on top of videos some viewers may find graphic or distressing. Videos with the screens applied aren’t eligible for TikTok’s main “For You” feed, but they aren’t otherwise prohibited. When a viewer encounters a screen, they can tap a button to skip the video or choose to “watch anyway.” (TikTok could not provide an example of the screens in use, however.)
Anecdotally, we saw videos showing the woman who was shot and killed yesterday appear on TikTok and then quickly disappear. But those we came across were from individual users, not news organizations. They also weren’t condemning the riot — they were simply direct video footage. It’s unclear whether the specific videos we saw were removed by TikTok itself or taken down by the users who posted them.
Separately from graphic content, TikTok says it will remove videos that seek to incite, glorify, or promote violence, as those also violate its Community Guidelines. In these cases, the videos will be removed as TikTok identifies them — either via automation or user reporting.
And, as it did in November, TikTok is proactively blocking hashtags to reduce content’s visibility. It’s now blocking tags like #stormthecapitol and #patriotparty, among others, and redirecting those queries to its Community Guidelines. Dozens of variations of those hashtags and others are currently being redirected. The company says it doesn’t share its full list in order to protect its safeguards.
TikTok had previously blocked tags like #stopthesteal and #QAnon, in a similar proactive manner.
We should point out that for all Twitter’s posturing about safety and moderation, it allowed Trump to return to its app after a few key tweets were deleted. And it has yet to block hashtags associated with false claims, like #stopthesteal, which continues to work today. Facebook, on the other hand, banned Trump from Facebook and Instagram for at least two weeks. Like TikTok, it had previously blocked the #stopthesteal and #sharpiegate hashtags with a message about its Community Standards. (Today, we noticed, those searches error out with messages that say “This Page Isn’t Available Right Now.”)
TikTok’s content moderation efforts have been fairly stringent in comparison with other social networks, as it regularly hides, downranks, and removes users’ posts. But it’s also been accused of engaging in “censorship” by those who believe it’s being too aggressive about newsworthy content.
That’s led to users finding more creative ways to keep their videos from being banned — like using misspellings, coded language or clever editing to route around TikTok policies. Other times, creators will simply give up and direct viewers to their Instagram where their content is backed up and less policed.
“Hateful behavior and violence have no place on TikTok,” a TikTok spokesperson told TechCrunch, when we asked for a statement on the Capitol events. “Content or accounts that seek to incite, glorify, or promote violence violate our Community Guidelines and will be removed,” they added.