4) And these activities will produce greater achievements in the future, thanks to the power of the crowd: eventually quantity simply produces quality. So — to use the example from Shirky’s book that has drawn the most attention — go ahead and play with your lolcats, because by so doing you are making your tiny but fundamentally “generous” and “creative” contribution to the hive mind.
It’s the last point that’s the key. An example often given to illustrate the wisdom of crowds is this: bring an ox into the village square, get everyone in the village to guess its weight, average their guesses, and you’ll end up with something closer to the correct number than any one expert is likely to produce. (Sometimes the illustration involves guessing the number of marbles in a jar, or something like that — you get the idea.) Shirky is banking heavily on this principle being operative in every aspect of human culture, in such a way that while any one person’s contribution to any particular endeavor may have infinitesimal value, economies of scale mean that the total achievements of the crowd taken as a whole will be vast. But is this true? Are there really no limits to the wisdom of crowds? And will economies of scale really take care of everything? Well, Jaron Lanier — whose book
You Are Not a Gadget, while it came out before Shirky’s, is a kind of refutation of Shirky’s key claims — isn’t buying it. There are some problems that crowds just don’t have the wisdom to solve, because some problems call for
expertise. “It seems to me,” Lanier writes, “that even if we could network all the potential aliens in the galaxy — quadrillions of them, perhaps — and get each of them to contribute some seconds to a physics wiki, we would not replicate the achievements of even one mediocre physicist, let alone a great one.” Lanier points out that the go-to example for celebrants of such wisdom is Wikipedia, but the (this is my point, not Lanier’s) only area in which Wikipedia provides anything that can’t be found elsewhere is popular culture, and even in that case all it gives us is lists and summaries. Really cool lists and summaries, in many cases — how about
the names of all the spaceships in Iain M. Banks’s Culture novels? — but nothing that creates or innovates. (In fact, creation, innovation, and “original research” are forbidden on Wikipedia.) In the sciences, Wikipedia simply copies and pastes what has been discovered elsewhere and can be found elsewhere; and in the humanities, the less said the better. Much is taken verbatim from public-domain print dictionaries. There’s a funny moment in
You Are Not a Gadget when Lanier imagines his 1980s self looking into the future to discover the great achievements of the hive mind, only to discover that they are a variant of Unix (Linux) and an encyclopedia. That’s
it? That’s the best you got? And in the meantime, what becomes of the person who is devoting most of her cognitive surplus to making lolcat captions, because Clay Shirky told her that by doing so she’s doing her part for the hive mind? That’s the question that leads us back to my own story — but that will require at least one more post.
Let me go contra, today.
Like you, my wife was a voracious reader as a child, and as a connected adult she has been an enthusiastic participant in online fan culture, especially reading and writing fanfic.
Yeah, I know, ugh, fanfic. But Sturgeon's Law holds there as well; and amidst the teenybopper squee, there's a pretty vibrant creative and critical community, every bit as sophisticated as anything I've read in the NYT Sunday magazine (okay, maybe not a strong statement, but you get my drift).
I do take issue with Shirky's favoring LOLcats over TV. TV's been everybody's favorite whipping boy my whole life, and most of the criticisms I see are pretty damn lazy ("I watched once with the sound off while I was doing something else and it's obviously toxic"). Conversely, I think the word and the concept of "creativity" have suffered terrible overuse and abuse. So Shirky's presumption that making Lolcats is better than watching Gilligan's Island (or drinking gin) rubs me the wrong way. (And I say this as a lover of both Lolcats and Gilligan's Island.)
What I'd like to hear is how all this cognitive surplus is going to be accessed. As an internet marketeer, the thing I've noticed most about the last 10 years is how much garbage data the internet produces; and that the methods developed to deal with garbage data are extremely crude, and almost never favor the marginal case.
I think this is important, because that's where the big bang for the buck innovation takes place — at the edge, or sometimes just beyond the edge of mainstream inquiry. That link I sent yesterday talked about ScienceBlog "taking up all the oxygen" due to the way information gets ordered on the internet, and it's certainly not the only example. It's an accelerating trend I see across the board.
I'm not worried about anyone making LolCats because Clay Shirky told her to, and I don't think that's what he's saying anyway. I think Shirky's pitch is two-fold:
To the general audience he's simply saying "All this technology is AWESOME! So go ahead, get that new computer, that new phone, that new tablet. You're participating in the next great human endeavor!"
To the bizz book/lecture circuit audience, he's saying "You know all that stuff your employees are doing when they are supposed to be making you money? Come close, and I'll whisper in your ear how you can turn all that 'cognitive capital' into gold."
In either case it's a pretty irresistible pitch. As it ever was.
We've debated the Lolcat thing endlessly in the comment threads here and I won't go into that again.
But I think there's something not quite right in your notion of "innovation" when it comes to Wikipedia.
the only area in which Wikipedia provides anything that can't be found elsewhere is popular culture
Not quite. If you're a programmer, Wikipedia is a wonderful resource, far superior to books. Data on all the states and cities in India? Yes! Data on so many cities here in the US? Yes again. I don't know how much of this is in other encyclopedias but I would suspect not. And I suspect that there are areas which we hardly ever use Wikipedia for and therefore don't even know that they exist.
But more importantly, isn't the fact that all this information (perhaps copied verbatim from other sources) was put online, in this quite complex hyper-linked system itself an innovation?
Finally, the crucial thing here — the innovation in Wikipedia really — is not so much how much of the information in Wikipedia is new but how much it gets used. Does it save someone somewhere some time in looking up something? Does it reduce the opportunity cost of finding information? I think the answer is yes; it's hard to quantify, but I suspect the small gains in productivity quickly add up.
Tyler Cowen said something related here.
Tony: I think a lot of fanfiction is genuinely creative, so what you say about your wife's community doesn't surprise me. And I don't mean to suggest that Shirky is going to have any influence over whether anyone else makes lolcats; rather, my point is that there's something unappealing about someone in the cultural elite telling people in the lower orders that they ought to be happy right where they are. Western culture has been around that block a few times.
scritic: I'm happy to take your correction about Wikipedia's usefulness to programmers, and also to agree that there is a kind of innovation in its model of aggregation. But innovation in a deeper sense, genuine creativity, is forbidden by Wikipedia's "no original research" policy. It will therefore always and necessarily be radically dependent on research being done elsewhere, creativity happening elsewhere. So I agree with Lanier that while Wikipedia is a cool thing, it doesn't promise a future of major idea generation.
And yes, we save time by going to a Wikipedia page for whatever we're searching for — often easy to do, since it will appear near the top of search results (a couple of years ago it would always have been at the top, but Google seems to have made some adjustments to its algorithm). But as Lanier points out, often you can get a better context for the information, and other interesting related data, by going to the page that Wikipedia got the information from. So Wikipedia may save time, but also may encourage us to be content with a contextually limited, or decontextualized, version of what we need to know. And the Wikipedia page may not be as detailed or as accurate as the source, depending on the editing. So speed and ease aren't everything.
Also, I owe you a better response to some issues you raised a couple of weeks ago — I'll get to it eventually!
I posted today a contrast between Carr and Shirky. A bit of a different take, and not so much a critique. FWIW.
Tony Comstock: TV's been everybody's favorite whipping boy my whole life, and most of the criticisms I see are pretty damn lazy ("I watched once with the sound off while I was doing something else and it's obviously toxic"). Conversely, I think the word and the concept of "creativity" have suffered terrible overuse and abuse. So Shirky's presumption that making Lolcats is better than watching Gilligan's Island (or drinking gin) rubs me the wrong way.
I won't be as ungenerous as Mr. Comstock by accusing him of laziness as he did me. I don't wish to be drawn into a flame war. But seeing as how Text Patterns in general, and more specifically, the blog series about Shirky's techno-optimism, both imply — even demand — a critical assessment of the media we consume, I have no hesitation to balk at TV the same way Prof. Jacobs balks at the wisdom of crowds argument. Mr. Comstock disagrees about TV. That's fine. But to accuse me of intellectual laziness for failing to hijack the comments with a long diatribe about TV, that overreaches. I could certainly make those arguments if I felt them appropriate, and I would have a lot of company.
Adam, thanks for the info — it's good to know about your site.
In fact, I wrote up a response to some earlier thoughtful comments by scritic that will appear later this week.
Dear Brutus,
You're right. I should be more gracious.
I would like to take up Alan's point about expertise by posting a long quote from something he linked to from his @TextPatterns twitter feed:
"In the literary culture of the past few hundred years, novels dominate the landscape like a mountain range, but one that is even more impressive for its massive centrality than for the heights of its summits. Unquestionably some of the towering books of modern times were novels, but other peaks, more isolated but just as high, were thrust up by philosophy, poetry, history, economics, autobiography, psychoanalysis, anthropology, et cetera. So the eminence of the novel in literary culture owed nothing to any monopoly on greatness. It derived instead from the novel’s special status as a popular form, written by and for amateurs rather than scholars, that could nevertheless achieve true artistry, that could be at once “of the best” and “of the (middle-class) people.” Familiarity with good or great novels, even if there wasn’t so much as a handful of them that everybody had read, connected all literary or educated people into a society of book-readers.
"The inherent amateurishness of the novel, of its writers and implied readers alike, seems vital in this. Not that no authors relied on writing novels to make a living; obviously many did and do. But (as the fictional novelist Bill Gray remarked in Mao II) the novel was essentially a democratic form, and writing one a feat that potentially anybody could pull off at least once. Among the audience, even less expertise or specialization was required. To read a novel you had to be literate and to take an interest in life as it’s lived by individuals, and that was about it. The great novelistic subjects—manners, family, growing-up, alienation, friendship, nostalgia, running away, love—tended to be things everyone had experienced, feared, or fantasized about. The novel portrayed common elements of life in a way that could be commonly understood, something true even in the case of the more rebarbative texts of the avant-garde."
What I hear in Shirky's "cognitive surplus," or that Xerox "Paper Chase" commercial from a few years back, or Apple's wedding-video-from-Tahiti ads, and the rest of it, is the promise that Amateurs Still Count, or even that In This New Technological Future Amateurs Count More Than Ever!
In a way, it's a strangely backward-looking POV; a hope that the wonderful era of Amateurism in Information and Culture is not over.
But look at Xerox's latest campaign. It's not about the end of gatekeepers; it's about how to manage all the data. And more and more I expect one of the bigger issues will be sorting mostly-not-very-useful information from amateurs (lolcats) from actually harmful information (spam, viruses) from information that is actually useful, most of which will still be created by experts.
My personal feeling is that due to the sheer volume of information being created, the machine-governed walls & gates that go up to facilitate this sorting process are going to be far less porous than the human-governed walls we've used to date; and that expertise and credentialism will become virtually indistinguishable.
Of course if I can't make the jump to this brave new world, I've already got my solar and wind-equipped sailboat ready to take me off the grid and into a floating paradise where I can be my own master!
Wikipedia is "not creative" in almost exactly the same way and for the same reasons that Google is not creative. Wikipedia is an index of information available elsewhere, compiled in a useful and convenient format.
The importance of Wikipedia is that it exemplifies the way the internet encourages people to be contributors to mass culture rather than simply consumers of mass culture.
It is the ways that LOLcats and Wikipedia have inspired millions (billions?) of people to see themselves as potential creators of mass culture that leads people like Shirky (I haven't read his stuff specifically) to think the internet will give us this great flowering of culture.
Only a very small percentage of what people create is really good and worthwhile, but if there are a thousand times more people creating stuff, the absolute amount of really good stuff may not quite increase a thousandfold, but there's surely going to be a whole lot more of it.
And that's only considering things from the consumer's point of view. If a thousandfold increase in the people creating and publishing music leads to even a mere tenfold increase in stuff that I enjoy, that's an enormous win for my personal enjoyment. But I'd also argue there's an intrinsic benefit to each of those musicians if they go from listening to playing, or if they go from playing privately to polishing their art enough to release a recording or a video on Youtube.
Only a very small percentage of what people create is really good and worthwhile, but if there are a thousand times more people creating stuff, the absolute amount of really good stuff may not quite increase a thousandfold, but there's surely going to be a whole lot more of it.
That's the Shirky thesis, all right, but I think it all depends on what those people are making. Shirky seems to think there's a tipping point at which quantity becomes quality, but I think that if you multiply mindlessness by a hundred or a thousand times it won't magically transform into mindfulness.
If a thousandfold increase in the people creating and publishing music leads to even a mere tenfold increase in stuff that I enjoy, that's an enormous win for my personal enjoyment.
Though this point seems eminently sensible, I wonder even about it. (I guess I am in a skeptical mood today.) The assumption here seems to be that there is currently a scarcity of good music — but is there? I already have more music on my computer than I can listen to, and I wonder if I wouldn't be better off with fewer recordings that I could get to know better. Could I handle a "tenfold increase in stuff I enjoy"? I'm sure I could not.
The other assumption of that claim is that while the music itself is scarce, the talent to make music is abundant and underutilized, and I don't think I buy that either!
I'm not absolutely opposed to any of these claims — really, I'm not — but I think they need more evidence in support of them.
RE: Music
I would invite readers and commenters to consider the effect of technology — amplification, duplication, distribution — on ensemble size and instrumentation; and the effect of ensemble size and instrumentation on the sorts of music that are composed and performed.
Tony, Lanier's book has some good stuff on that — he has a particular interest in music, in terms of performance, recording, distribution, etc.
I guess it partly depends on whether you believe the step from "I read stuff" to "I write stuff" (or from "I listen to music" to "I play music") is one of the crucial steps in becoming an artist. It seems pretty clear to me that the internet is encouraging a lot of people to make that step.
Maybe you think that's a trivial step for anyone who has genuine talent? That anyone with talent would have taken that step without the internet's encouragement?
I can offer one anecdote. One of my very favorite musicians, a guy whose songs I love so much I learn to play them and sing them myself, quit his 9 to 5 job as programmer and started recording and publishing music because of this very phenomenon. He released a few songs on the internet, they found an audience, and he decided to make a career of it. He doesn't have a publisher or a record contract. I too, have more good music than I can listen to, but I value his more than most of it.
On the other hand, I think the part of my comment that you didn't address, the intrinsic benefit to an individual of taking the step from consumer to creator, is even more important than this hypothetical multiplication of humanity's library of great art. But I think a defense of that assertion would be a pretty ambitious task, akin to defending the intrinsic value of reading.
Back when I used to make films about God I occasionally had the money to commission scores. Of course the reason I could afford to commission scores is because talented, young, hungry composers are a dime a dozen — but as poor as they are, even 15 years ago they could afford some pretty amazing synthetic music set-ups that would let us do some pretty lush orchestration (that would have been waaaaaaaaay out of reach if we had to pay for session musicians and studio time).
But I learned pretty quick that, due to the technology used, no matter how different the arrangement, synthetic scores all have a similar sound. I'd liken it to the way that all restaurant food tastes like restaurant food, no matter what cuisine.
Anyway, what I took to doing was insisting that at least one real instrument be used in a prominent way in the orchestration. At the time guitar was a good choice because synth guitar was beyond awful.
Anyway, tech is really good at doing what tech is really good at doing. It's not so good at doing what it's not so good at doing. Hammer, nail, blah blah blah.
Michael, if your friend can make a living that way, that's wonderful, and very encouraging. But the only musician I know who's doing that is Jonathan Coulton, which puts your friend in very select company indeed. I'm sure there are more than two, but the numbers aren't encouraging at this point. Maybe that will change over time. I know I keep referring to Jaron Lanier, but he's got some great stuff in his new book about the economics of music today.
On the other hand, I think the part of my comment that you didn't address, the intrinsic benefit to an individual of taking the step from consumer to creator, is even more important than this hypothetical multiplication of humanity's library of great art.
Well, I thought I was addressing it by asking, "Creating what?" Which I think matters. (Tony's most recent comments suggest just one of the ways in which it matters.) But just a few days ago you were arguing that "consuming" — i.e. reading — a book could be more valuable in several ways than "making" any number of things, so I thought you were on my side in this case! In any event, I'm quoting you on that point in a post that's going up in a day or two, so I hope you haven't changed your mind. . . .
As a musician, I know more than a little about the economics of music. The application of modern technology is definitely a double-edged sword. For every few success stories of musicians who have found an audience giving away their work, there are dozens more who have been put out of work by diminished demand for live performance. The Broadway orchestra pit has shrunk from 25 or more live musicians to only a handful supported by a playalong track or virtual orchestra. Many wedding bands have lost nearly all their bookings to DJs (who function not as musicians but as party hosts). Dance bands and theater orchestras scarcely exist anymore. The anecdotes pile pretty high — far higher than a blog comment box can tolerate.
Alan, my point was not at all about the economics, but just a pre-emptive example for my argument that the internet does encourage talented people to take the plunge and share their talent, not just provide a forum for talentless hacks.
And I'm not arguing that "creating" is better than any and all "consuming" (we need better, less prejudicial labels for these concepts in this conversation). I'm just saying that it's good for people to also be creators rather than exclusively consumers.
Brutus, the wedding band thing is surely more about changing tastes in entertainment than technology? DJ's have been around for a long time.
I've played in dance bands, wedding bands, opera orchestra pits, Broadway show pits, and ballet pits. It's economics abetted by technology driving live performance down. Chalking it up to changing tastes is just the tail wagging the dog. Besides, ask anyone with any taste and he or she will say it's far preferable to have a live band to dance to at the wedding, prom, or convention. It's simply a lot more fun. But the economics of hiring a 20-piece band vs. a DJ with a microphone and volume knob is hard to argue with.
Michael, try running the question through a "defining deviancy down" POV. As recorded music becomes more and more acceptable in places that traditionally would have employed live musicians (pit orchestra, hotel lobby), it becomes less and less unacceptable in the marginal cases.
Brutus, back when we had an office space in Manhattan, the company making that orchestra-in-a-box thingo was across the hall. I remember how proud they were of the "vamp" and manual tempo (tap tap tap tap) features. That was ten years ago; no doubt they've been replaced by something at half the cost, and the market has expanded with machines of half the quality at a tenth the cost.
RE: Taste
People who don't work in the creative professions often ascribe to taste phenomena that have economic explanations. Just yesterday I heard a comic artist explaining the palette of his early work to a (disappointed?) caller. Not a bold artistic stroke for a muted tri-tone. All about money. As the artist said, "I would have loved to be able to do the whole thing in color, but we were lucky to have the money to be able to add the blue ink!"
As recorded music becomes more and more acceptable in places that traditionally would have employed live musicians (pit orchestra, hotel lobby), it becomes less and less unacceptable in the marginal cases.
Sure, but I don't really see that as a result of technological changes. Maybe I'm just ignorant. Has technology really made DJs significantly cheaper or better over the last 30 years?
I think I've bungled this point. Let me try again.
Generally speaking, people prefer what they are familiar with. As the sound of live music has been replaced by recorded music, people's ears have become less appreciative/tolerant of the vagaries of live performance. For a market trained away from live music, the choice is painfully obvious: an expensive wedding cover band playing second-rate versions of their favorite hits, or an inexpensive DJ playing the "real" thing.
Extended: as compression-encoded music has become more pervasive, people have become less appreciative/tolerant of the sound of uncompressed digital and analog recordings; to the point that live sound engineers at some venues are introducing on-the-fly compression into live mixes before pumping them out through the PA system, and the amount of compression on recorded tracks is now an aesthetic element, with more compression than technically needed to address bandwidth concerns sometimes being used for the sake of "the sound".
And then there's labiaplasty, but that's more censorship and economics driving "taste" rather than technology and economics driving taste.
It's true that Wikipedia presents no new content, but it does collect, summarize, and organize existing information in novel ways (and with varying degrees of success), which is not going to directly produce the theory of relativity but which can make that existing theory more accessible to others. (I'm not vouching for the current article on the subject, you understand.)
I also don't mind confessing that Wikipedia's vast audience has inspired me to dedicate many hours to seriously researching articles I've worked on. I know that for every wording tweak from anonymous users that I see in the (non-obscure) articles I've worked on, there are hundreds or thousands of others reading it. Even though I contributed anonymously, how many people can say they have such an audience?
Connected to this, another of my motivations is to shape opinions, particularly in the church. Sometimes the opinions that have been reshaped are my own.
An example of the latter: I worked on some articles on controversial theological subjects in conjunction with opponents of my view, and their iron sharpened mine. In one case, on a constellation of related topics, I came to a new understanding of my opponent's view, and how close the two camps really were. That insight came by working under the neutrality guidelines imposed by Wikipedia (something you wouldn't necessarily get at partisan Wikipedia copycats or from a book editor at a publishing house friendly to your view). The article had to sound right to both of us in order for us to leave it be. In the course of our editing and debating, straw men were knocked down, and the real (but not-as-great-as-I-had-been-taught-and-read-about) differences were laid bare. It was a great experience for me and my friend-opponent.
Also with a view toward shaping opinions, I researched and wrote on a topic with some thoroughness in order to disprove the (often unintentionally) fallacious position of some Christians on a particular matter. I wasn't writing anything new — just collecting information already publicly available, often in more scholarly venues — but I was making it accessible and more visible to Christians and the public at large. I was practically guaranteed a larger audience than writing the same material up on a blog, and I had built-in helpers to copy edit, fact-check, etc.
With the advent of more children, I have had to shift my efforts elsewhere, so I can't say the articles I once labored over are still in the condition in which I left them. But I still keep watch over a couple that I spent the most time on.