Why We Don't Like Swift
There's nothing wrong with excellence.
Swift is an abominable language. But so many neophytes like it! That's precisely the point (or one of them). If Steve Jobs was right when he estimated that good programmers were two to three hundred times better than run-of-the-mill programmers (and he was) then who exactly are we 'letting in' by dumbing things down?
It's been tried before. The ever-feminist Sweden, distraught that there were too few female programmers in the Land of Absolut™, tried slimming down the five-year curriculum at the Royal Institute of Technology. They tried to squeeze what they could into an all-new three-year variant. But it didn't work: corporations weren't interested in the new graduates, and the number of female applicants didn't increase.
Swift is an attempt to dumb things down. Not specifically to attract more women, but instead to attract more people in general. It doesn't work. Rubbish in, rubbish out, as George Carlin would say.
Apple's OS X environment is space age. Apple didn't write it, much less conceive it. NeXT did that - Steve's lumpy crew of programmers who were two to three hundred times better than the 'rest of them'.
There's nothing wrong with excellence. There is something wrong with saying there is something wrong with it, and acting on it.
Apple's old slogan is that their platform is for 'the rest of us'. They prioritise workflow and ease of use. Which is fine. The iPhone project, for example, was an example of how greatness can be channeled to great effect - for 'the rest of them'. But you can program on a tablet only with difficulty, and on a mobile you can't at all.
Apple's OS X is NeXT's NeXTSTEP and OPENSTEP, which in turn descend from Alan Kay's Smalltalk systems. Alan Kay coined the term 'object orientation'. Given his vision, 'organism orientation' may have been better. What Alan could see, given his experience with Logo, is that a computer screen represented myriad organisms, each sovereign in itself. The way to program these organisms was not to use procedural code, but to send messages.
Brad Cox incorporated these ideas in a 'compiled' version of Smalltalk which eventually got the name 'Objective-C'.
Objective-C was initially little more than a preprocessor on top of C. Changes to C itself were nonexistent; everything new was layered on top: a few enhancements, such as the great '#import' directive, and, above all, the messaging paradigm.
The Objective-C runtime is a workhorse. Sending a message is not a function call. Sending a message involves checking the environment: first asking the target if it wants the message, then 'asking around' if perhaps someone else wants it, and so forth. Things like this are not possible in procedural programming - they're dynamic. (The overhead is negligible.) The syntax reinforces the paradigm.
Swift reverses this, much as C++ reversed the quantum leap from Pascal to C. And, much as happened with C++, Swift looks to become the bane of the industry.
Apple had always been about the 'rest'. Steve Jobs, after fleeing 'Siberia', went out searching for the 'best'. He found it. Ultimately he sold it to a dying Apple for $429 million. That technology became the kernel of Apple's resurgence, the single most credible threat to Microsoft's destructive hegemony, and the basis of the system powering iPhone. $429 million proved cheap when the purchaser became the first publicly traded US company to hit the trillion-dollar market cap.
It's a shame to see Tim Cook stoop to such mediocrity.
The Learning Curve
The learning curve to Objective-C and Apple's 'Cocoa' is almost nonexistent - provided the programmer is a 'real' programmer. Understanding C and the exigencies of C is of course a sine qua non, but that holds for just about anything in computer science. For those with the proper prerequisites, learning Objective-C can take less than a day. And the ordered structure of Cocoa is akin to the daylight greeting the protagonists at the end of the film 'Blade Runner': once you know how a single class is structured, you know how they're all structured - this despite the fact that the entire API is an order of magnitude larger than Microsoft's shambolic Windows API. Cocoa - the original work by NeXT - is probably the most significant milestone in computer science since the 1970s in Murray Hill.
Yet the very existence of Carbon - prolonged ad absurdum by legacy programmers unwilling to take the 'leap' - speaks volumes. Never before had a transition been easier (or more exciting). As with C, the syntax of Objective-C leaps out at the programmer as a series of self-evident de facto truths. Of course '!=' means 'not equal'! Of course '[receiver message]' is a message, and right to left, as in assembler! Cocoa made life easier, made software more reliable, and cut development times by as much as 80%!
But, alas, Cocoa is not for everyone. Neither is programming. Programming is an art, and a joy, not tedium. But look at the shape of things today. Art hasn't been seen in a long while. And when's the last time you came upon a joyful programmer?