David Thornburg (Comment 11) introduces some historical perspective and argues a radical rejectionist position: there is nothing new in "web 2.0". He cites a number of web 2.0 technologies that have historical precursors:
Blogging? Oh, you mean “bulletin boards?” These were wildly popular when Marc Andreessen was in elementary school.
Oh, I know, Second Life! Yesiree. That has Web 2.0 written all over it. Except that Neal Stephenson wrote all about it in Snow Crash, published in 1992. His vision of a parallel virtual world became reality a few years later with a program called the Palace that kids all over the world were using to create virtual worlds ...
He goes on to point out that there is a difference between qualitative change (truly new stuff) and quantitative change ("web 2.0").
Although he is correct in a technical sense (a point acknowledged in the discussion and reinforced by Tom Hoffman, Comment 20: we should use language correctly, and web 2.0 is hyped), I don't think he wins the debate overall.
Chris Lehmann (Comment 13) and Andy Carvin (Comment 15) challenge David's qualitative / quantitative distinction by pointing out that a lot of incremental changes (even if none represents a true innovation) do eventually add up to a qualitative change in the overall environment. I agree with this dialectic: sufficient quantitative change can lead to qualitative change.
A brief summary of some of the points they and others made:
hardware: broadband explosion, cheaper storage
software: easier to use applications, RSS matters for keeping track, tagging / folksonomies, open APIs
social: more user generated content: eg. blogs, wikis, photos, podcasts, video, more community
economic: low to zero barriers to entry to publish
I liked the way Mike Guerana (comment 46) summed it up:
The new iterations of the “old” web technologies have been amazing evolutions in being able to receive information that is more customized and personal. I like the fact that I can use Google Reader for RSS and not have to visit each page throughout the day. I don’t care what it is called ...
OK. David Thornburg made a bold discussion point and there was a good response from some authorities and participants in the "web 2.0" community.
But historical analysis has many perspectives. There are other good critical discussion points about "web 2.0" that could have been made but were not. IMO, David Thornburg was making a technical-historical point: that web 2.0 technologies have pre-web precursors. I think it might be more important to make the educational-historical point that we have seen great computer based educational technologies in the past - Logo is a good example - that have now faded so far from view that many in the "web 2.0" community may never even have heard of them, or of the educational philosophies of their advocates (Papert, Harvey, Kay, Stager etc.)
Sylvia Martinez elaborates on this other history in web 2.0 and historical perspectives:
Furthermore, within the thread and within the web 2.0 community some things are said, by intelligent people, that I think are wrong / dangerous. For instance:
Right now the concept of Web 2.0 in schools is in the hands of excited educators who have felt the power of learning something new and want to share it with their students and other educators. It’s a contagious, revolutionary feeling that we are on the cusp of something that will change the world.
This feels SO much like the 80’s, when computers first started trickling into schools. But the dark side is how schools, instead of letting educators show the way, turned to corporations and publishers to commercialize and pre-package the computer into school-friendly forms. It deprived students and teachers of authentic chances to program, to make music, and to create. Instead of the revolution in learning that seemed to be ever so temptingly on a permanent horizon, it turned computers into test prep machines that reinforced the way school “delivered” information to students.
The score: Technology - 0, “School” - 1
Miguel Guhlin (comment 42): "I don’t have to worry about being an expert since there are so many people out there who are experts…I can rely on you"
David Weinberger has written a book with this title: "Everything is Miscellaneous"
Andrew Keen, a web 2.0 critic, said in his debate with David (video link) that the book was good but he could never agree with the title. I agree with Keen here: the title is very bad. (I haven't read the book.)
Leigh Blackall in response to my earlier post said he relied on trusted experts: "my Youtube experience is made up of recommendations from experts I already trust".
This argument reinforces my belief that expertise is another important question that "web 2.0" needs to address. When experts argue, we begin to make progress. We should not simply be trusting experts but building environments where they are encouraged to argue.
Overall the debate on David Warlick's blog was great because experts argued on issues of importance, many of the comments were of high quality, and there was passion and involvement.
But web 2.0 is not always like this. One of Seymour Papert's learning principles espoused long ago in The Children's Machine (1992) is that, "a good conversation enhances learning". Well, how about a bad conversation? And aren't there many bad conversations on web 2.0 forums? As I said in an earlier blog:
Global village idiocy, banalisation, hive mind, self censorship and chasing popularity are all real problems.
As web 2.0 scales this problem may well become worse. eg. at the moment most teachers who blog do have things of interest to say; they tend to be movers and shakers who want to change the system. Perhaps those clamoring for more and more of web 2.0 ought to reflect a little on what normally happens to technologies once they enter the mainstream.
4 comments:
Hi Bill,
It's good to analyze critically and thoughtfully. Too bad blogs are such a lousy way to have a conversation. Something needs to be invented soon to help this mishmash of tracking systems. (Web 2.1?)
By the way, Americans have an aversion to confrontation and argument. I hope a more global perspective will help us see how critical argument and dissection of views can be constructive.
I'd like to hear more about "expertise" and how we figure that out. It used to be easier, with peer-reviewed papers, degrees and such. Or maybe that is all just lack of historical memory as well. For example, I've seen it argued that Uncle Tom's Cabin ended slavery in the US by telling a simple human story. All the research, important people making speeches, etc. did not do that, and had relatively little impact. Or did it plow the field so that seed could grow?
Is expertise valued? Has it ever been? What is an expert? How do you know? Can we ever find a happy medium between simple ideas that hit people where they live and deeper thought?
hi sylvia,
Thanks for the story about Uncle Tom's Cabin and your questions. They are big questions; I'll try to write a separate blog in response.
I did write some comments earlier about truth on Graham Wegner's blog. And on my own blog, truth seeking environments
I've left comments on Leigh Blackall's response and on Miguel Guhlin's response
Bill, I agree with Sylvia's comments about how you reported on the conversation taking place on my blog, kind of a meta-versation. ;-)
Forgive me.
She is also right about our (U.S.) aversion to confrontation -- especially here in the south. We are made of softer sentimentalities.
That said, I wanted to point you and your readers to my more formal response to Thornburg's arguments -- In Response...