The moderators who sift through the toxic detritus of social media have gained the spotlight recently, but they’ve been important for far longer — longer than internet giants would like you to know. In her new book “Behind the Screen,” UCLA’s Sarah Roberts illuminates the history of this scrupulously hidden workforce and the many forms the job takes.
It is, after all, people who look at every heinous image, racist diatribe, and porn clip that gets uploaded to Facebook, YouTube, and every other platform — people who are often paid like dirt, treated like interchangeable parts, then disposed of like trash when worn out. And they’ve been doing it for a long time.
True to her academic roots, Roberts lays out the thesis of the book clearly in the introduction, explaining that although content moderators or the companies that employ them may occasionally surface in discussions, the job has been systematically obscured from sight.
The work they do, the conditions under which they do it, and for whose benefit are largely imperceptible to the users of the platforms who pay for and rely upon this labor. In fact, this invisibility is by design.
Roberts, an assistant professor of information studies at UCLA, has been looking into this industry for the better part of a decade, and this book is the culmination of her efforts to document it. While it is not the final word on the topic — no academic would suggest their work was — it is an eye-opening account, engagingly written, and not at all the tour of horrors you may reasonably expect it to be.
After reading the book, I talked with Roberts about the process of researching and writing it. As an academic and tech outsider, she was not writing from personal experience or even commenting on the tech itself; instead she found she had to invent a new area of research essentially from scratch, one spanning tech, global labor, and sociocultural norms.
“Opacity, obfuscation, and general unwillingness”
“To take you back to 2010 when I started this work, there was literally no academic research on this topic,” Roberts said. “That’s unusual for a grad student, and actually something that made me feel insecure — like maybe this isn’t a thing, maybe no one cares.”
That turned out not to be the case, of course. The practices we read about with horror, of low-wage workers grinding through endless queues of content ranging from child abuse to terrorist attacks, have been in place for years and years, successfully kept out of public view by the companies that rely on them. But recent events have changed that.
“A number of factors are coalescing to make the public more receptive to this kind of work,” she explained. “Average social media users, just regular people, are becoming more sophisticated about their use, and questioning the integration of those kinds of tools and media in their everyday life. And certainly there were a few key political situations where social media was implicated. Those were a driving force behind the people asking, do I actually know what I’m using? Do I know whether or how I’m being manipulated? How do the things I see on my screen actually get there?”
A handful of reports over the years, like Casey Newton’s recent one in The Verge, also pierced the curtain behind which tech firms carefully and repeatedly hid this unrewarding yet essential work. At some point the cat was simply out of the bag, even if few people recognized it for what it was.