Sarah Jeong’s short book The Internet of Garbage is very well done, and rather sobering, and I recommend it to you. The argument of the book goes something like this:
1) Human societies produce garbage.
2) Properly-functioning human societies develop ways of disposing of garbage, lest it choke out, or make inaccessible, all the things we value.
3) In the digital realm, the primary form of garbage for many years was spam — but spam has effectively been dealt with. Spammers still spam, but their efforts rarely reach us anymore: and in this respect the difference between now and fifteen years ago is immense.
And then, the main thrust of the argument:
4) Today, the primary form of garbage on the internet is harassment, abuse. And yet little progress is being made by social media companies on that score. Can’t we learn something from the victorious war against spam?
Patterning anti-harassment directly after anti-spam is not the answer, but there are obvious parallels. The real question to ask here is, Why haven’t these parallels been explored yet? Anti-spam is huge, and the state of the spam/anti-spam war is deeply advanced. It’s an entrenched industry with specialized engineers and massive research and development. Tech industries are certainly not spending billions of dollars on anti-harassment. Why is anti-harassment so far behind?
(One possibility Jeong explores without committing to it: “If harassment disproportionately impacts women, then spam disproportionately impacts men — what with the ads for Viagra, penis size enhancers, and mail-order brides. And a quick glance at any history of the early Internet would reveal that the architecture was driven heavily by male engineers.” Surely this is a significant part of the story.)
Finally:
5) The problem of harassment can only be seriously addressed with a twofold approach: “professional, expert moderation entwined with technical solutions.”
Having followed Jeong’s research and her reflections on it, I can’t help thinking that the second of these recommendations is more likely to be followed than the first. “The basic code of a product can encourage, discourage, or even prevent the proliferation of garbage,” and code is more easily changed in this respect than the hiring priorities of a large organization. Thus:
Low investment in the problem of garbage is why Facebook and Instagram keep accidentally banning pictures of breastfeeding mothers or failing to delete death threats. Placing user safety in the hands of low-paid contractors under a great deal of pressure to perform as quickly as possible is not an ethical outcome for either the user or the contractor. While industry sources have assured me that the financial support and resources for user trust and safety is increasing at social media companies, I see little to no evidence of competent integration with the technical side, nor the kind of research and development expenditure that is considered normal for anti-spam.
I too see little evidence that harassment and abuse of women (and minorities, especially black people) is a matter of serious concern to the big social-media companies. That really, really needs to change.
1 Comment
Look at campus. Here at Purdue, I know that class is often a place where people have to be very careful about how they speak for fear of offending someone and triggering formal repercussions. Meanwhile, I can't ride my bike down Slayter Hill on a Saturday afternoon without frat boys shouting homophobic slurs at me from their porches. That's more or less the same situation we have on the internet.