Pizza Delivery Man
Reflections on Dewar/Schonberg.
It is sad that programming is becoming yet another wannabe art and is rather ceasing to be an art altogether.
- MN Karthik
First off: if you haven't read 'Computer Science Education: Where Are the Software Engineers of Tomorrow?' by Robert Dewar and Edmond Schonberg, or Datamation's report on it by James Maguire, then do it now. Then come back here if you still want to.
What Dewar/Schonberg have to say has been a long time coming.
Here's the skinny: computer science education in the US is teh suck. Really truly the pits. Most of it's related to the use of Java as an introductory programming language, but it actually goes a lot deeper.
What it's really all about is that enrolment tanked after the dot com bust and universities panicked for students. So they decided to make things easier - and guess what? They got more students, students who coast through the curriculum and then fall flat on their faces in the workplace. And coincidentally bring down the software industry with them.
That's the executive summary.
What's Right About Java?
The question could be better phrased 'what's wrong with Java' but that way the answer's interminably long. Java doesn't teach you how to 'think computer'. You don't learn how computers really work. You're basically where Visual Basic programmers were ten years ago. You cobble together a heap of spaghetti from a bunch of prefab libraries you don't understand and your app just about works. Congratulations.
You don't learn about instruction pointers, segments and offsets, allocation heaps, stacks, compilers, von Neumann architecture, CPU instructions, microcode - none of that.
You do something not more sophisticated than using MS Word.
But it's not only Java hiding the secrets of computer science - it's the professors too. They want money. They need enrolment. So they loosen the requirements. Where they should be submitting prospects to Darwinian aptitude tests to filter out the unworthy they're letting baristas study instead.
Of course this is nothing new. And as someone has said on more than one occasion: when there's something dumb going down there's somebody familiar lurking around the corner - and you should be able to guess who that is.
About 10 years ago Microsoft started offering certifications. Independent companies started selling course programmes for these certifications. These programmes cost a lot of money so the companies had to guarantee their students would pass their exams sooner or later.
HR departments started looking for these certifications when recruiting. It's easy to see what happened. Companies everywhere started to get in real trouble. They were taking people with no industry experience whatsoever on the basis of a piece of paper alone.
Not the First
Java wasn't the first mistake either - object orientation was. Object orientation stresses not seeing the details: no worries, no concern for hardware limitations, for what's possible and what's downright stupid. This isn't exactly good if you've never learned how computers work in the first place.
Many might point an accusing finger at Bjarne Stroustrup and we'd be amongst them. C++ is not an object oriented language anyway - it's an impossible concoction that replaced the inaccessible Objective-C. Ask Alan Kay: he invented the term 'object orientation' and he's always been very clear about the Stroustrup miscarriage.
Worst of all OO encourages reuse. Which is a good thing. But for neophytes who've never written code in the first place it's a disaster. The writing was already on the wall when one of the foremost experts on Windows driver programming was asked to create 'some simple C++ classes' so people would have an easier time of it.
Ten years ago the sister site Radsoft got a request from a university student in the US. He'd been tasked with writing a simple ping/traceroute program to pass his course, and he wanted Radsoft to write the actual code for him so he could wrap it in Visual Basic and pass his exam. When he was told ping and traceroute weren't that difficult and that he might more profitably invest in actually studying, he turned vicious.
John Walker of AutoCAD fame has often seen the same thing.
It hardly matters to anyone outside the US what happens with computer science in the US; but if the trends spread they could spell doom for all. Java skills mean nothing - today's Java 'programmer' is tomorrow's pizza delivery man.
Dewar and Schonberg have their own recommendations. They're Ada fanatics so they're going to be biased. The basic ideas are however universal. They think a curriculum combining a number of languages with a foundation based in C is the way to go. That's hard to disagree with.
But their orientation (embedded systems) isn't for everyone and their experiences - no offence intended - are nonetheless limited. Here's a recommendation for a better and more universal programme.
- The aptitude test. Don't waste anybody's time. Find out at the get-go if someone is suited for the 'art'. If they aren't it's better they get out now. Don't be kind - be merciless. In some countries (where these proprietors are from) this is common; it should be common everywhere.
- BASIC. Not some fancy schmancy 'structured' BASIC and certainly not Visual Basic but good old 010 020 030 BASIC. If it doesn't even have GOSUB all the better. And not for a whole term - a few weeks at the most. And this to get a feeling for the 'linearity' of computers - despite how you're going to want to think of them later.
- Some basic stuff on transistors, CPU manufacturing, gates, CPU flags, rudimentary assembly with perhaps MVS, Intel, Motorola, POWER, et al.
- C. The rest of the first term. To start with. Because C is the portable assembler. Because to really understand C is to really think computer.
- Objective-C. Not C++ or anything else. And not Apple's flashy IDE but basic command line stuff.
- Teach the abstract but pressure the students to actually write software. As Brian Kernighan said: the best way to learn how to program is to program.
- Operating systems. And their history. And philosophy. And so forth.
- Programming languages. And their history. Start with FORTRAN of course. Look at Algol, COBOL, all the widely used languages.
- Now mix in all you want. All the other stuff.
A good programmer with a degree can hope to write code in the first two years on the job; the average programmer will mostly empty wastebaskets. This is OK - they don't teach you that much at Harvard Business School either.
Those at the top of the game are rare. IT companies survive by hiring the mediocre. But the mediocrity of today isn't what they had in mind. The mediocrity of today makes the mediocrity of ten years ago look like a braintrust.
At a government programming office dealing with IBM mainframe routines, 150 programmers wrote the code for the nighttime operations. Of these 150 programmers, 148.5 wrote their code in COBOL and 1.5 wrote theirs in MVS assembler.
The 148.5 were responsible for 5/6 of the routines; the 1.5 were responsible for 1/6 all by themselves. Do the math: the assembler programmers outperformed their COBOL colleagues by at least one order of magnitude.
The assembler routines almost never crashed; the COBOL routines almost never ran smoothly.
Top of the game is rare. That government office would have rather had 100 times as many assembler programmers and zero COBOL programmers. But that simply wasn't possible.
Knowing one's limitations, however, is not the same thing as lowering standards.
Developers Workshop: The Fourth Rule
The Technological: Surgery for Dummies Next?
Datamation: Who Killed the Software Engineer? (Hint: It Happened in College)
Computer Science Education: Where Are the Software Engineers of Tomorrow?