What Open Source Can Learn From Apple
Both the good and (sometimes) the bad.
There's a lot of buzz right now about Google's coming open source Chrome operating system. And one of the things already evident is how pundits come out with articles about what open source (and thereby Google's Chrome) can learn from Apple. Mostly these articles are pure filler. The authors have absolutely NFC about how things really work. They're journalists, not computer scientists. They don't know. All they do is fill web page real estate and draw uniques.
But open source does have a lot to learn and Google may have a few things to learn as well. And a lot of the lessons to be learned do indeed come from Apple - both for the good and sometimes the bad. They certainly don't come from Microsoft!
Apple hardware. Open source contributors don't offer computer hardware. But a look at Apple hardware still teaches a lot. Steve Jobs has been known to compare Apple hardware to BMW. But perhaps a better comparison would be to Dior, Gucci, Versace: brand names with brand recognition that set the styles. Of course none of this is of any value if the products are no good - if consumers can't latch onto the new ideas no matter how much Kool-Aid they drink. But Apple have Jonathan Ive and some of the best designers in the world, and NeXT had Susan Kare and Keith Ohlfs.
Lesson? Pay for a good design team if you can't get one to work full time for free. One way or another you have to have a good design team. Period.
Object orientation. There are network administrators out there who to this day teach Windows system architecture and begin with half a day explaining object orientation. Ask them what the F they're doing and they'll tell you 'NTx is object based'. Try to explain to them there's a world of difference between object oriented and object based and their eyes will glaze over.
Too few people outside the Kay/Tevanian/Jobs camp understand what object orientation really means. And that it doesn't have squat to do with operating system kernels. But everything to do with the icing on the cake. Steve Jobs pointed this out in his own 'Mother of all Demos' for NeXTSTEP 3.3 - that too many people tried to copy the 'icing' on the 'object orientation cake' without understanding what lies beneath.
Even Steve Jobs admits he didn't 'get it' when he first saw Alan Kay's lab. The original Macintosh was not object oriented - the interface 'pretended' to be object oriented. But the NeXT time around in Redwood City Jobs and his friends got it right. And that interface - today called Cocoa - is the only mainstream object oriented user interface in existence.
And it shows. Look under the bonnet of KDE and you'll see an utter mess. Look under the bonnet of GNOME and you'll see something almost worse. Even Mark Shuttleworth doesn't get it - he admits he's no systems programmer or advanced programmer but he still thinks pretty icons are going to make up for the lacklustre performance of the interfaces he promotes. It doesn't and it won't.
There are several historic reasons why Objective-C never made it beyond Redwood City (and later Cupertino). But they're actually excuses rather than reasons. OS kernels continue to be written in C and this is as it should be. But you can't address the complexity of user interfaces with C alone - this is what Microsoft have done and look at the results. This is what GNOME also do - again: just look at the results.
The pity of all this is that these people have never been exposed to true object orientation, so they don't get it either. All most of them have seen is Microsoft Windows - and if you're going to copy from someone else, that's about as bad a model as you can pick.
The original Macintosh was a copy - for both hardware and software - of the systems Alan Kay had running in his lab. They were made to look the same way at any rate. But Microsoft Windows is also a copy - a bad copy. At least the Apple people got it right as far as outer functionality goes. The Microsoft thieves didn't even get that right. And then along comes open source and who do they copy?
The kernel is one thing. And kernels aren't going to be 'object oriented' and they shouldn't be either. But user interfaces have to be. And yet no one but Apple today have an object oriented interface to offer.
It's not only a question of outer user interface design - it's a question of how the code works. Development on Apple's OS X can go five times as fast as on other platforms - and with better and more stable results. C++ (used by KDE) is not repeat NOT object orientation. It's a mongrel at best, a disaster not waiting to happen at worst. Outside the world of Objective-C there's no flexibility. Nothing can be as dynamic. Write your OS kernel (it's already written) in pure C as Linus has always insisted. But stop using those lacklustre UIs like GNOME and KDE.
The ideal here of course would be if Apple and Google could effect some type of merger. Call it Gapple if you will. Apple have an effective monopoly on good user interface design and development tools (including programming languages). This is something both open source in general and Google in particular can profit from.
One system - and only one system. This isn't something Google or the open source people are likely to founder on, as their kernel is already written and agreed on. But Apple have in effect two disparate operating systems in the same system - at least on their computers. This never works. If you want security - and Internet users are screaming for it - then you need one path and one path alone from user to hardware through the OS kernel. You can't have alternate APIs that do the same thing.
The old IBM PC BIOS was a mess. It was a patchwork of parallel, partially overlapping interrupts that often did much the same thing. Easily exploitable. Apple mix in bits of their old beige box OS (Carbon and earlier) under the bonnet where Cocoa and more importantly Unix reside. You can't have that. Relying only on a Linux kernel will protect systems from this booby trap, but it doesn't hurt to keep people aware of the dangers out there.
Let the other guy do the hard work. This is one of Brian Kernighan's three golden rules of programming. It sounds cheeky but what it really means is you never reinvent the wheel. Open source per se is a huge grouping of a great number of separate projects with separate programming teams. Each project contributes its bit.
You don't go and take a Linux kernel and then rewrite it substantially if you want to ship a complete OS. You work with the kernel 'as is'. If there's a reason to update that kernel then you let the kernel people do it. If you discover a bug or a security hole in the kernel then you don't fix it yourself - you tell the kernel people about it and stay on their case until they fix it.
Not only does this lessen the workload for all involved: it also makes the final product more secure. The more you fork open source code the more of a mess your overall product becomes. And the more time it takes you to reapply changes of your own when the upstream team come out with an improvement. Open source modules should be used 'out of the box'. 'As is'. Don't change that code yourself.
And yes, Apple have eaten dirt too many times because of this. They can't use file system drivers out of the box because they want to support Finder info and resource forks and whatnot. Change a few things at the source and Apple have to wait until their key team have time to review the new code, see where their own previous changes can be reintegrated into the new release - and test the new code as well. In the world of security - at least now, with the rout of Microsoft continuing unabated - you simply can't risk wasting that time.
There's little risk Google will screw up or screw with a Linux kernel. The greater risk is that their user interfaces and programming interfaces won't be up to snuff. Graphics good enough for the web won't necessarily be good enough for local use on the local computer. Here is where Apple can be invaluable to Google.