I’m taking a few computer science courses this year, partly for fun and partly to backfill some of my skills. I’m almost completely self-taught as a programmer, and while I’ve been coding in various capacities for nearly 20 years now, I’ve had almost no exposure to academic computer science and very little experience with the languages that are principally used for teaching these days (C++ and Java). So I’m learning both of those languages simultaneously to get them out of the way. (As I’ve mentioned here before, I’m also teaching an online introductory web development class for the University of Victoria that starts next month, and taking these classes now is a way to get the teaching part of my brain working.)
Figuring out how to optimize the process whereby a developer starts using a given technology is a big part of my business, so I’m carefully studying the way that academic programs get student programmers going in a computer science course. As you might expect, there’s a lot to contend with, and even on a good day things are pretty messy. You’ve got ten different types of students coming in with at least three or four different kinds of computers and operating systems. Once you’ve sorted that out, you still have to get languages, tools, and database servers installed on each student’s machine. The information technology management challenge is steep, and there’s a chicken-and-egg problem at work here: students can’t set up a development environment until they learn how to be developers, but they can’t learn how to be developers until they’ve set up a development environment.
When I was an undergrad starting in the mid-80s, most people had to go to the computer lab to get their work done. I had my own home computer to do school work and work-work on, but I was almost totally on my own when it came to figuring stuff out, and because there was no dial-up internet back then, I still found myself having to schlep into the computer lab and figure out XENIX to test and turn in assignments. That crummy logistical experience turned me off to the whole notion of academic computer science for many years.
The classes I’m taking this semester each devote a good week or two (out of an 18-week term) just to getting students ramped up, which means that almost 10% of the class is devoted to preliminaries. There is a lot to do: you have to establish an identity on the college’s server, log in and figure out various unix commands and compiler options, and install a complete development environment on your own PC for development and testing. None of this is a challenge for a professional developer, and everybody should certainly learn how to do it, but a lot of the people in these classes are teenagers with little more than advanced browsing skills, and week 1 of Introduction to Programming may not be the right context for this kind of activity. Having to spend a couple of weeks fiddling with command-line parameters just to get things working can’t help their desire to learn, and they can always pick up the details in a system administration class after they’ve gotten “hello, world” working.
It seems like this process could be significantly streamlined if schools made a standard, pre-configured machine image available for students to download and use in their classes. They could use something like the free VMware Player to bundle an operating system and development environment that students could download at no cost. Virtualizing the development environment has other benefits as well: if a student were experimenting and did something to mess up their environment, they could simply blow away the VM and re-download it. So not only would this make the first weeks of an undergraduate CS class go faster, it would also cut down on calls to the campus support desk, and probably save a bunch of money.
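The blow-away-and-restore workflow could be a one-liner from the student’s point of view. A rough sketch, where the directory names are entirely hypothetical and a local pristine copy stands in for the image the school would host for download:

```shell
# Hypothetical sketch of the reset workflow; all names are made up.
mkdir -p cs101-vm-pristine            # stand-in for the downloaded class image
cp -r cs101-vm-pristine cs101-vm      # start from a clean copy
# ...student experiments, possibly wrecking the environment...
rm -rf cs101-vm                       # blow the broken VM away
cp -r cs101-vm-pristine cs101-vm      # restore a pristine copy
# vmplayer cs101-vm/cs101.vmx         # then relaunch it in VMware Player
```

The point is that nothing a student does inside the VM can damage anything outside it, so recovery becomes a file copy instead of a support call.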