Should we even try to teach programming? I have met hundreds of programmers in the last 30 years and can see no discernible influence of programming on their general ability to think well or to take an enlightened stance on human knowledge. If anything, the opposite is true. Expert knowledge often remains rooted in the environments in which it was first learned--and most metaphorical extensions result in misleading analogies. A remarkable number of artists, scientists, philosophers are quite dull outside of their specialty (and one suspects within it as well). The first siren's song we need to be wary of is the one that promises a connection between an interesting pursuit and interesting thoughts. The music is not in the piano, and it is possible to graduate Juilliard without finding or feeling it.
- Alan Kay, The Early History of Smalltalk

This has been niggling away at me in the background, and in some of my dialogues with others, for some time now.
I have on occasion argued that programming skills in their own right do stand for "higher order thinking" (a phrase I'm no longer happy with) and deserve their own place in the sun.
And I have argued against the Victorian VELS (integration of computing into the curriculum) on the grounds that young students need exposure to skilled computing teachers, not the English teacher who sits out the front marking while kids get on with their word processing (and playing games they have smuggled in).
But the above quote from Alan Kay, and his general emphasis that the most important thing is the non universals, does force me to look again at these stances.
I wasn't wrong about the underlying problems of computer integration. I wasn't wrong that programming skills are, in some hard-to-define way, "advanced". It's not so much that my arguments were wrong but that I didn't have a sufficiently firm grasp of the big picture: that the non universal powerful ideas really are the main issue.
This probably makes school reform harder still, because the way Schools generally teach maths and science is not powerful either. I've rocked the boat in science faculties in the past and they didn't like it. And teaching maths on the computer using Logo? Yes, you can dabble with that in the middle school if you want to, but there just isn't time for it in the senior curriculum.
Programming can be powerful, but unless we get the context and environment right, any gains will be much less than what could have been achieved. As Alan Kay points out, many good programmers don't inspire in a more general sense. The harmony is not in the piano. Logo or Game Maker is not Weeties for the brain.
Although some of my own history is good, it is easy to lose the way, to lose track of the powerful ideas, in the complexity and inflexibility of School.
The reason I first decided to learn to program was that I read parts of a cult book, Gödel, Escher, Bach by Doug Hofstadter (230 amazon customer reviews), and decided the only way I could possibly understand recursion was to explore it through programming. Mindstorms by Seymour Papert (my review) was another seminal influence, which led me eventually to comp.lang.logo on USENET, which became a vital part of my learning environment for a few years. Good start. So I explored some Logo, did fractals and other interesting stuff (ISDP, Quadratics, immersion).
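To give a rough feel for the kind of exploration I mean, here is a minimal sketch of a recursive fractal tree. It uses Python's turtle module as a stand-in for Logo's turtle graphics and is purely illustrative (the function name and parameters are my own, not the original Logo work mentioned above). The point is that recursion stops being abstract when you can watch each self-similar branch draw itself.

```python
import turtle

def tree(t, length, depth):
    # Base case: stop recursing when the branch gets too short or too deep
    if depth == 0 or length < 5:
        return
    t.forward(length)
    t.left(30)
    tree(t, length * 0.7, depth - 1)   # draw the smaller left sub-branch
    t.right(60)
    tree(t, length * 0.7, depth - 1)   # draw the smaller right sub-branch
    t.left(30)                         # restore the original heading
    t.backward(length)                 # walk back to the branch point

if __name__ == "__main__":
    t = turtle.Turtle()
    t.speed(0)    # draw as fast as possible
    t.left(90)    # point the turtle "up" the screen
    tree(t, 100, 7)
    turtle.done()
```

The same idea translates almost word for word into Logo (forward, left, right, back), which is part of why turtle graphics made recursion feel so concrete.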
But 20+ years later I'm teaching year 9 specialist computer classes basic IT skills like file management and word processing in a computer lab. I'm bored with it, slipping into darkness.
Of course, I've been doing other interesting stuff along the way, like game making, but overall I've taken my eye off the main game: the powerful ideas / non universals. I don't see any easy answers for this in the context of School. But we need to talk about it.
11 comments:
I don't see any easy answers either, Bill. But I do remember a fellow manager (who is from a "real" engineering background, he says) stating that programming is not engineering - and shouldn't be taught as an engineering discipline in schools. He contends that programming is merely a craft used as a means to an end. Not coming from an engineering school, but from music school, I see his point. But I'm cynical :-).
When I float this idea to other "engineers" (be they mechanical, EE, or whatever) most scoff at it.
Hi Bill,
I'm kinda confused by the Alan Kay quote. Should we not teach programming because he finds programmers to be bad dinner party guests? I'm not sure what to make of that quote at all. Maybe I need more context.
Anyway, on the engineering subject, my undergrad degree is engineering, and I went to a traditional (at the time) school that focused on providing a broad engineering background, with mandatory courses in all major disciplines, from chemical engineering to power to mechanical. I'm glad I did that; even though those courses weren't "practical", they did leave me with a solid sense of how to solve problems. These days it's harder to find this type of school; they tend to ask students to specialize early on. I think it's a "prep for vocational/grad school" emphasis that's misplaced.
At that time, computer science was taught out of the math department, and they focused on compiler construction and things like search algorithms. The eng students always joked that the math majors knew how to search for answers, but we could find them.
We learned programming simply because it was required to solve problems in various disciplines. So that's similar to what Brad's friend is saying, and I can see his point.
However, in K-12 education, it's just a shame that we require so much crap and, for most kids, NO programming. Why the imbalance? We've messed the science and math curriculum up so badly, as you say, that it's little more than vocabulary tests and prep for imaginary college classes that most kids will never have. Of course there should be programming, just like there should be physics of the real world, or math that was invented after Newton.
I'm less interested in making kids think any certain way, and more interested in providing as many avenues as possible for kids that don't dead end in some predetermined job. I think we should teach programming -- and art, and photography, and math, and psychology, and ecology, and history, and biology, and economics, and electronics, and sculpture...
I know it is nearly impossible to change such a globally entrenched system, and that's frustrating.
I agree with what Brad basically said: that programming is not engineering. I think this is what people who use dynamic languages accept. That's the reason they want to use languages and systems that are "alive" and dynamic: they're easy to change. When you are in a non-engineering discipline that's what you need. Not that programming will never become an engineering discipline. I think it can and it should, but we don't have the knowledge yet to make that happen. Unfortunately there are a lot of people who don't know that.
As for programmers generally not being of the Enlightenment, I agree with you. Quite a few have no sense of culture or history and feel they have no use for it. Further, programming is a society of guilds that resembles the way people used to work a few hundred years ago. It has just naturally grown out of what the computer industry has become: a set of competing standards, each vying for supremacy.
I don't know that it's anyone's fault this has happened. Alan Kay blames it on a "pop culture" that developed once personal computers became popular. I don't know if that's entirely accurate. I think it was more an outgrowth of the IT data processing mindset that's existed since the early part of the 20th century. Only now it has a prettier face on it thanks to the work Kay and others did at Xerox more than 30 years ago.
hi brad,
I suppose engineering is about building bridges that don't fall down, and so it does curtail some imaginative and risky explorations.
I quite like Brian Harvey's analysis, where he compares two programming styles: the discipline of software engineering versus the art of artificial intelligence.
Another paper, by Paul Graham, argues that hackers and painters have a lot in common: "What hackers and painters have in common is that they're both makers".
I think what Alan Kay is saying is that we need to import elements from a broader world, a more general philosophical perspective, into the world of programming. That our initial perspective needs to be broader than a productivity requirement. My own experience has been that it's easy to slip back into productivity-type thinking in a School and social system that continually exerts pressure in that direction.
hi sylvia,
I think the section in the Alan Kay paper, The Early History of Smalltalk, before and after the part I quoted does provide quite a bit of extra context. I see this paper as essential reading for those of us who want to understand the history of computing and / or computers in education.
The part I quoted was at the contrarian rhetorical edge of a broad discussion about mediums, tools, the use of design templates, metaphor, fluency and what real literacy is. I was tempted to put in a big slab of the quote but instead just blogged about one part that had given me a jolt and set me thinking about an aspect of my teaching that was starting to bore me: the teaching of routine IT skills.
In the School context everything becomes a compromise between the ideal and the real. But I have told my school that next year I'd rather teach maths / science and integrate that with computing than do more of the IT skills type classes. In the school context that will create different problems for me (lack of access to 'computer labs') but overall I now see it as a better way to go.
you write:
"I'm less interested in making kids think any certain way, and more interested in providing as many avenues as possible for kids that don't dead end in some predetermined job ..."
I think that's one aspect of good teaching. But I've also been thinking that Alan's non universals list represents something that our formal education system ought to be teaching, somehow. How else will children acquire this knowledge?
hi mark,
you write:
"I don't know that it's anyone's fault this has happened. Alan Kay blames it on a "pop culture" that developed once personal computers became popular. I don't know if that's entirely accurate. I think it was more an outgrowth of the IT data processing mindset that's existed since the early part of the 20th century"
I think it is somebody's fault. A speculative connection between a murder and a MySpace account becomes front page news, whilst Alan Kay stating that computer science hasn't developed in 30 years does not rate a mention in any mainstream media.
Hi Bill,
I think your demand about your class structure is the right choice. As you say, school context becomes a compromise. I think "we" (reformers) give up very easily on changes since we know how tough change is, so we pre-compromise too often, and then the forces of evil simply demand more compromise.
I once knew a commercial artist who said he put a duck in every picture (not a real duck, just something weird). When the client said, "What's with the duck?" he would storm around in an artistic way and then eventually grudgingly remove the duck.
His advice on negotiation was "always put in a duck" - so that's my advice to you on your proposal for your class next year. Don't propose a compromise you think might be accepted, go for what you really want.
@Bill:
I think it is somebody's fault. A speculative connection between a murder and a MySpace account becomes front page news, whilst Alan Kay stating that computer science hasn't developed in 30 years does not rate a mention in any mainstream media.
Alan Kay is someone whose name has always been out in the ether. He'll show up in some media from time to time, but not very consistently. He doesn't get noticed very much except by those who are deep in the computer science community. That's my impression anyway. Even in the two videos I've seen of the original Smalltalk system, which he was intimately involved with, his name isn't even mentioned. He takes hard stands on some issues that only computer scientists and people involved with human cognition would understand, and that's probably the reason he gets noticed at all. Otherwise, he's very humble. In every presentation where I've seen him demonstrate Squeak, he's always sure to give time to others whose work he based it on, like Sutherland, Deutsch, and Engelbart, almost as if to say, "I don't really deserve much credit for this." So he doesn't really elicit people's attention, though I think it's very rewarding for those who do find him and pay attention.
Think of it this way. If Stallman had said the same thing about computing would the newspapers have noticed it?
Besides, computer science is a pretty obscure subject to most folks, though we're told it's important. Most people don't know what it is or what importance it has. In fact, if anyone in the vast populace can relate to it their first thought is probably of the dot-com crash and the subsequent (vastly overblown) stories of outsourcing to India, which immediately evokes a negative reaction. MySpace has more current social relevance in people's minds. I'm sure you can see that. One of your blog posts was on the "social divide" between people on MySpace and Facebook.
Re: who's to blame, I don't feel like I can put the blame on one individual or group, but more on a mindset. I've written some blog posts on the topic of the "data processing mindset" and "scientific computing", as blogger Paul Murphy defines them. "Scientific computing" in his lexicon is just another name for the "man-computer symbiosis" that Licklider talks about. The fact that these competing visions exist is interesting to me. I don't know if there's a formal name for it, but there's been an opposing philosophy that's been around for at least as long. One name for it would be "human replacement" (with machines). It's probably known by its more common name of "automation".
Paul Murphy has written about these competing visions occasionally. He usually writes about Unix (Solaris and Sun Ray displays, specifically). He believes that it's part of the "man-computer symbiosis" paradigm. IMO if it is, it's a compromise. Systems like the Apple Mac and Smalltalk are much closer to it. Yes, I know the Mac runs Unix now, but what Apple did was improve upon it, not conform to it. Apple has done yeoman's work in gradually introducing the ideas from the "symbiosis" camp to a wider audience and trying to popularize them. As you can see it's not that easy. The data processing mindset is pretty well entrenched, though the GUI concept, which came from work done on symbiosis, managed to break through. It's been subsumed by the web browser interface for the past 10+ years, which in a way is a recapitulation of the data processing mindset, though now it's become democratized in the sense that many more people can experience it, which has had some benefits. When it comes to actually delivering useful computing services it could be better.
One thing I will say for the web is it's been great for finding useful and enriching knowledge. I wouldn't have had the realizations I've had if it didn't exist (or something like it). So I am thankful for its good traits.
Murphy's most recent article on the competing visions I've mentioned is "Captain Cyborg and the Problem of Evil". A couple years ago he wrote a history of how these two visions developed. He says the "replacement" approach is the driving force behind what he's called the "data processing mindset", and the "symbiosis" approach is the driving force behind what he calls the "scientific computing" mindset. He endorses the latter.
Data processing and its related technologies are where most of the money is in computing. It's been this way for decades. There was an interesting interview with Alan Kay in ACM Queue from a few years ago where he talked about this at some length. Perhaps you've seen it through my blog. I wrote about it here, quoting quite a bit of the interview.
I have a "theory", if you will, that the reason the data processing mindset has persisted is it fits well with Industrial Age thinking. It believes that capital investment is supposed to go towards the machinery, not the employees. The Information Age is supposed to turn things around: capital is invested in developing the knowledge of employees. I pay a little attention to what's going on in business and so far I've only heard of one successful non-tech-focused company that has taken this approach. Unfortunately I can't remember the name. I vaguely remember it had something to do with farm equipment.
I contend we're still entering the Information Age, and I base that largely on the hiring practices companies use to find software developers, and the way they treat software projects. When they're hiring they act like they're hiring assembly line workers, line managers, electricians, and positions of that sort.
Microsoft and IBM are easy targets for blame when it comes to this subject, but what about all the customers and developers who swear by this approach? They're complicit in it as well. As for the latter two groups I have some compassion for them because I figure they don't know any better. This is what they're used to. And as Alan Kay says, in order to recognize the value of this other form of computing, people have to become acculturated in the kind of ideas that support it. So the solution is not a technological one. The ideas have to come first. That's where people like you come in. ;) I have my blog and I try to get these ideas out that way, but they don't always settle in fertile soil. As clear as I've tried to be I've seen people read my articles and still not get it. As Kay said in the interview I cited, computing, like TV, has met people where they are.
hi mark,
Thanks for explaining in more detail what you meant by the data processing mindset and contrasting it with a human augmentation or symbiotic human-computer relationship. The Paul Murphy articles were new to me.
This is one of the big philosophical questions raised by Alan Kay, originating from Doug Engelbart.
You might be interested in the material on these pages of the learning evolved wiki:
enactivism, e.g. the information about Andy Clark's book, 'Natural Born Cyborgs'
the AI behaviourism approach from Rodney Brooks
Also the link on that page to a discussion / difference of opinion between Rodney Brooks and Ray Kurzweil. There are some quotes on the page putting forward the Brooks view, which I think is better than the Kurzweil view.
One of Cory Doctorow's sci-fi books has people routinely storing and later downloading their minds into new bodies. This goes right against the Brooks / Clark idea that cognition is distributed and mind is embodied and situated.
I think what you are saying is that it is really no one's fault that more advanced ideas are minority ideas, and so it's natural that they tend to not emerge into the mainstream. I suppose that's one underlying reason why progress happens in fits and starts, by jerky or revolutionary change and not by gradual incremental evolution. That's about the connection between advanced ideas and how things change, which is something I do mean to write about more. I suppose we need leaders who are both advanced and popular - it is hard to find them.
@Bill:
I have seen sci-fi tales of people dying and their brains being put into machines so they become disembodied minds. That's more in the "human replacement" vein.
The movie that came to mind as I wrote my last response, one that dramatizes in an extreme way the battle between these two visions, is The Matrix. You have people imprisoned inside machines, which give them an existence in a virtual world, but they feed off the energy generated by human bodies (which is something you have to suspend disbelief about). In the revolutionary force that has managed to escape and is trying to overthrow it, you have free humans who are awake and aware in the real world, who occasionally link themselves up with a virtual space or the Matrix itself, but are able to go in and out of those virtual worlds as they choose. One of the scenes has the main protagonist, Neo, downloading martial arts programs from a computer very quickly, and practicing with his mentor in a virtual dojo. What's interesting is that the point is to develop one's mental abilities and to realize what the system really is. The first movie is the best one. The first sequel is kind of interesting, though you don't really discover that until the end. The last one completes the story, but is not that impressive.
@Bill:
Following up on your last paragraph, I did a bit of an analysis on Steve Jobs and Apple in a blog post I wrote called "Triumph of the Nerds", named after a PBS documentary series done by Robert X. Cringely that came out around 1996. Based on that documentary I talk about how the relationship developed between IBM and Microsoft around 1980, which led to "the deal of the century". It also covers how Steve Jobs discovered, and a bit of how he acquired, the knowledge and technology to go forward with the Lisa and the Mac, starting in 1979. The latter is kind of revealing about how progress is made. Jobs didn't understand the full vision, but he understood it better than Xerox's own executives, and it was enough to make a marketable product out of it. Steve Jobs gradually came to understand more of the ideas.

Fairly quickly after the Mac was introduced in 1984, they licensed laser printing technology from Adobe, which was founded by a Xerox PARC alumnus who had figured out how to drive laser printers using PostScript. Laser printing was invented at Xerox. Shortly after leaving Apple in 1985/86, Jobs founded NeXT, where they developed NeXTStep, the UI for their systems, using Objective-C, an OOP language that runs on a VM. Obj-C was the primary development environment for the NeXT. I've heard that it kind of looks and acts like Smalltalk, but the syntax is kind of like C. I've never worked with it, but the Cocoa framework and the Aqua UI are based on NeXTStep, and the Mac now uses Objective-C as its primary development environment. So the NeXT system, developed in the late 1980s, was the first "next generation", if you will, of the system that was developed at Xerox. It was more expensive than the Mac, however. The Mac today is an inheritor of that legacy, made into a consumer product. It still doesn't fully embody what Smalltalk is, but it's the closest a consumer product has come.