One of the very interesting things we’ve learned since launching CodeLesson at the end of last summer is that a lot of people have no good sense of what it means to take an online course, even though the concept of taking an online course has been around for a very long time (predating the web, in fact).
There’s an unfortunate namespace pollution at work here. I see it done by companies that want to associate their media products with the cachet of the classroom or the university. For example, “iTunes U” is not a “University” by any stretch of the imagination. Any “online course” that consists of nothing but web pages and/or videos is not actually an online course. And so on.
Videos can have value, obviously. But let’s not trick customers into thinking that a pre-recorded online video is the same thing as an “online class”.
By our definition, a real class has structure, a set agenda (so you know ahead of time what will be covered), and a way to reinforce and verify what students have learned (using hands-on exercises and quizzes).
And, crucially, real courses have instructors. Without an instructor, you have no straightforward way to get questions answered, no idea when you’ve learned something incorrectly or incompletely, and no idea whether you’ve achieved real proficiency.
At CodeLesson, we charge money for nearly all of our courses; this enables us to hire world-class instructors who can spend time answering students’ questions. The amount we charge students is generally more than you’d pay for a pre-recorded video, but a little less than what an American college student might pay for a university course. A common criticism of our model, though, is “why should I pay you when I can get what I’m looking for from Google (or BitTorrent, etc.)?”
You can learn stuff by Googling, sure. But when you look something up on Google, you get 1,000,000 search results for every phrase you type in, you have no sense of whether the information is complete or even accurate, and you have no sense of when you’ve learned enough to be proficient in the subject you’re interested in. That’s totally fine if you’re looking for a high-level overview of something. But it’s not an ideal tactic for managing the continuous knowledge acquisition that a professional should be basing their career upon (particularly if the value of your time is greater than zero).
Not everyone is an autodidact. Not everyone should be an autodidact. I think people are forced into being autodidacts because the two institutions they historically relied upon to convey technical skills (the university and the employer) have utterly failed at supplying the material that technical professionals need to learn today. This wasn’t always the case (I’ve blogged a few times, both here and over at the CodeLesson blog, about how I traveled the world doing developer training in the 90s).
But the point (and I do have one) is that it’s no longer necessary to travel the world to get access to expert knowledge. Learning can happen online, effectively. And it’s terrific that there are lots of formats and price points out there. But it’s time that we make a clearer distinction between the different types of online learning to avoid snowing prospective customers.