After VITTA 2007 (critique), I stopped going to computer conferences because the keynotes had become dominated by web2.0. Web2.0 is good, of course, but when you divorce it from history, philosophy and epistemology, and add in some evangelicalism, it's not healthy.
It's good to see this shallowness being vigorously challenged by ceolaf in the comment thread of Will Richardson's Digital Inclusion post.
As part of the discussion, ceolaf linked to The Partnership for 19th Century Skills by Diane Ravitch, which once again drives a front-end loader through the gaping holes of the 21st Century skills rhetoric.
I left a comment at Will's site but couldn't put it where I wanted due to the nesting levels feature. At any rate, what I wanted to say was that the best broad-brush attempt I have seen yet at identifying the fundamentals that ought to be taught in school was made by Alan Kay in his outline of the non-universals. I have made a beginning attempt to put this in one place: non universals
I did go to the CEGSA (Computer Education Group of South Australia) conference this year (cegsa09) and was once again disappointed with the keynotes, with the notable exception of David Loader, who does have a real sense of history (in part because he has made it), the wisdom of our elders, learning theory and human psychology. I don't want to offend the hard-working organisers of this conference; the workshops were good. I do think, however, that there is an ongoing issue with keynote speakers at computer education conferences, owing to the narrowing influence of the web2.0 movement.
From April last year: web2.0 introspection
3 comments:
The discussion between Will Richardson and ceolaf was interesting. The sense I got from Will is that he's saying "Our kids live in a digital world. We should teach them how to exist in it, to interact and to learn." He was concerned about how students are socially situated given the new context. ceolaf's point was that "the timeless lessons" (one could reasonably equate these to the non-universals) should be taught, and that technology should be evaluated from that point of view, to see whether any of the technologies enhances the teaching of those lessons. What I detected from the other side of the discussion was a question about whether the new context perhaps brings in new lessons that we are not yet aware of, and a worry that if the use of technology is "limited" to teaching the "timeless lessons", we're cutting off new avenues for learning. That seems like magical thinking to me. Why not evaluate the technology with the "timeless lessons" in mind and see what new lessons can be added to the "canon", if any, rather than assuming something new will just magically emerge?
I thought ceolaf's POV was brought into high relief when he dialogued with Clay Burell about teaching climate change. Clay argued that climate change is a new reality not previously anticipated, and that students should be taught new skills for dealing with it. ceolaf scoffed at the notion, saying that we should teach science, ecology, and awareness of our interconnectedness: those are the basic lessons that underlie awareness of this issue.
I think what ceolaf points out well is how educators can get caught up in fads around issues or trends, and think that these represent very new contexts for which "the old knowledge" no longer applies.
What I think Will and Clay exhibit is a mindset that says students must be taught how to live in what are seen as the current contexts, with a focus on skills. I've written about this before: skills are useful, but if that's all you're taught, it's a brittle framework. What do you do when a new situation comes along that your skills cannot deal with? Acquire new skills? Where did those skills come from? Probably from someone who didn't think in terms of skills when they discovered and thought about the new context.
ceolaf says the context matters some, but it may only have limited applicability and utility to education. What's more important is long range historical awareness (to help the student situate themselves better in current contexts), and "the timeless lessons" which will give them the "brainlets" (as Alan Kay has called them) to perceive any context they're in (current or future) more clearly.
The "fad & skills" mindset was around when I was in school. The big thing back then was personal computers, and the thought was that an important skill was going to be computer programming. So they taught that in the schools. They saw logic as an essential prerequisite to being successful at it, so that was incorporated into the programming courses, but there was no awareness of what computers or programming really represented in terms of the non-universals. Personally I loved programming, but not because I recognized I was going to need it someday, or it was going to enhance my career opportunities. I liked it as a means for molding a machine to create what I wanted. It was like painting on a canvas for me (with LOTS of corrections). It did enhance some of my learning abilities. At least that's what my mom noticed. She was a teacher herself, so I assume she could see that clearly.
The thing was, after 20 years or so the fad of teaching programming in the schools petered out. From what I've read from you and Mark Guzdial, it barely exists now except in AP CS courses in the U.S., where unfortunately students are taught Java as their first language.
What's wrong with Java as a first language?
@Jason:
I was going to go into this but thought it would take things off topic. When I took CS, the introductory language was Pascal. I'm not saying I prefer Pascal, but compared to C++ and Java I'd say it's better as a first language. Pascal was designed by Niklaus Wirth as an educational language, with a pedagogy oriented towards teaching procedural, structured programming. Again, I'm not saying that's preferable to a different approach.
The thing I notice about Pascal is that it doesn't introduce you to concepts before you're ready. This is what I'm getting at. A simple hello world program in Pascal looks like this:
program Hello;
begin
  writeln('hello world');
end.
In Java it looks like this:
class HelloWorldApp {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}
The elements of the Pascal version are not that hard to explain to a beginning programmer. There are about six concepts a beginner would have to understand to fully comprehend what the Java version is doing: a class, the public and static modifiers, the void return type, a method with parameters, arrays (the String[] args), and the System.out object.
I'm not saying one language is better than another on its merits, but rather asking which is easier to learn, so that students can start getting some experience in learning how to program without having to accept terms and concepts they don't understand up front just to get started. There are other possibilities that could be used. I personally like Squeak; that's been tried. Other suggestions might be Ruby or Python as a first language (see the sketch below). People can get to Java eventually, once they've gotten comfortable with the ideas of programming.
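For comparison, here's roughly what the same program looks like in Python, one of those suggested alternatives (a minimal sketch, assuming Python 3, where print is a built-in function):

# The whole program: one statement, with no class, method or
# modifier scaffolding to explain before a student can begin.
print("hello world")

Like the Pascal version, it introduces almost nothing the student isn't ready for.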
I've even heard assembly language suggested as a first language, because it's a linear style of coding which most people can relate to easily, and depending on the processor the instruction set is pretty simple. It also teaches some fundamental ideas about computing very quickly. I don't know. I kind of agree with the idea, but at the same time I'm wary: introducing students to memory addressing that early might be too much for a beginner. Then again, when I took assembly I was shocked at how much more reliable my programming became. I had assembly programs that worked on the first try! It was rare for that to happen in an HLL. I don't make it a practice to program in assembly, but for whatever reason it was a lot nicer than I anticipated. I guess the equivalent in Java would be starting out programming at the bytecode level in some sort of assembly (see the sketch below).
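To give a flavor of that linear, one-instruction-at-a-time style without committing to a particular processor, here's a sketch using Python's dis module to peek at the bytecode behind a one-line function (the exact opcodes shown vary by interpreter version, so treat the output as illustrative):

import dis

def greet():
    print("hello world")

# Disassemble the function: the output is a flat list of simple
# stack-machine instructions, executed one after another.
dis.dis(greet)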