
Gonna Switch?

Reasons why you should - and perhaps should not - migrate to Apple.

If you've never heard of Ellen Feiss, make her acquaintance now. Ellen is the most famous person to come out of Apple's much talked about 'switchers campaign'.

She's something of a legend.

The Apple switchers campaign spoke to ordinary users and explained things in terms they could understand.

I'm writing to share a tragic little story.

My Dad has a PC that my sister and I used to use for our homework assignments. One night, I was writing a paper on it, when all of a sudden it went berserk, the screen started flashing, and the whole paper just disappeared. All of it. And it was a good paper! I had to cram and rewrite it really quickly. Needless to say, my rushed paper wasn't nearly as good, and I blame that PC for the grade I got.

I'm happy to report that my sister and I now share an Apple PowerBook. It's a lot nicer to work on than my dad's PC was, it hasn't let me down once, and my grades have all been really good.

Thanks, Apple.

Ellen Feiss

Which is about as direct as you can get, and Ellen's experiences are hardly unusual: Windows is a kludge and Apple is as good as she says. But for the more technically inclined - and for those really dragging their feet - here's a bit more to whet your appetite.


Ellen's story focuses on the stability of the two operating systems, Windows and OS X. Windows is indeed a kludge, a wobbly hodgepodge that doesn't even deserve to be called an operating system, whilst OS X is at its core a Unix - and you can't get much better than that.


Windows will most likely go down in history as the most vulnerable platform ever. It's hard to conceive of a worse situation - and this is with Microsoft's Service Pack 2 for Windows XP factored in. It's still a mess, it's still incredibly full of holes, and it always will be.

The Windows defenders like to use the old 'Harrisburg' type of spin each time a new outbreak occurs - Windows is only more vulnerable because... and so forth - so that in the end you completely forget the real issue: it doesn't matter why Windows is so vulnerable, only that it is.

If you run Windows, you're going to get clobbered all the time - and incidentally you're probably going to invest a lot of money in add-on products that don't really help you that much anyway, the cost of which has to be factored in as well. Your PC is not really as cheap as you make it out to be, and with all the hair-pulling it's not really worth it.

Try to get this, and get it so it sinks in: people who run OS X go through none of the stuff Windows users get hit by. You're not going to get hit through your mail client or your browser; you're not going to become a zombie with trojans and keystroke loggers running around on your system; you're not going to have bank account and credit card information sent back to a 'mother ship'. In fact your life will become so calm you'll soon forget what all the worry was ever about.

According to Earthlink, the average Windows PC is infected with thirty trojans; the average Apple OS X computer is infected with none.

Good Hardware

PCs are 'rush to market' jobs. They're invariably poorly designed, aesthetically ugly, and doomed to fall apart right about when your one year limited warranty runs out. There is a single exception: IBM. IBM hardware is good and has always been good. But considering all the rest, x86 is still not a prudent idea.

Many live under the belief Apple hardware is more expensive, but it's not. The initial outlay for the box itself may be more (or may be less - see the new iMacs as an example) but over time you end up saving money, not wasting it.

Statistically Apple hardware has a better record than even IBM. This means that the box you buy today should still serve you well several years down the line - long after your x86 junk box has literally fallen apart.

Apple also have an immense advantage here that the x86 OEMs can't do anything about: Apple make the entire box, and for their own operating system. Peripherals are compatible; computers 'just work' right out of the box. You don't have to worry about the operating system detecting and understanding new components you want to put in because the system already has them and is well acquainted with them.

Apple computers also come configured to the hilt: what you get default on an Apple you will never get with an x86. The x86 price might look lower, but when you start asking if they have this and they have that, the price goes up dramatically. And when you start adding things on, you run into an entirely different problem: as these 'super boxes' are not the main output of their assembly lines, they won't be tested as well. You're much more likely to run into snags and difficulties getting your machine to perform - despite paying those big bucks.

And Apple - like IBM - have a reputation for high quality hardware that they're not about to jeopardise. Each and every component of your Apple computer goes through rigorous tests before it leaves the factory; Windows PC OEMs rarely do any testing at all: if the machine boots Windows, it's good enough, and if there's anything wrong, it's cheaper to let the customer find out later.

PowerPC Architecture

Apple computers use a different processor: the PowerPC. The PowerPC is a project initially created by Apple, Motorola, and IBM. Motorola have since left the field, so today only Apple and IBM remain, with IBM doing the engineering. The 64-bit PowerPC of today is based on IBM's 'Power4' architecture. In strict abstract terms, the PPC, as it's called, has it all over the x86.

The x86 is actually a throwback to the first single chip processor ever, the Intel 4004. Back then just getting 4-bit registers on board was a gargantuan feat. The instruction set of the 4004 was 'octal' and its on board registers were few.

The PowerPC was designed in an era when the severe constraints on Intel for the 4004 were no more. Where Intel struggled to access memory in twelve clock cycles, the PowerPC from the outset did it in one.

The PowerPC was from its first release so well designed, so perfect in its boots, that only speed was needed to make it more and more viable. And certainly both the x86 and the PPC have undergone tweaks and improvements since then, but the basic difference remains: the PPC is good at doing the kinds of things we want to do with our computers today, and the x86 is not.

A lot of people make a lot of stink about clock rates in processors, as if this and this alone determines how fast and well your computer runs. But consider the following: the traditional x86 CPU has only four truly general purpose registers whilst the PPC has thirty-two. Where the Intel has to continually switch things in and out of the registers to complete calculations (and perhaps at the rate of one instruction per twelve otherwise fast clock cycles), the PPC can keep it all 'on board' and only store things in memory when the calculation is complete.

Even if the PPC lagged far behind the x86 in pure clock speed, it would make up for this and more with its more sensible on board efficiency. It's like a two cylinder lawnmower engine revving furiously but generating no real power: it's not the RPMs that are determinant.
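The arithmetic behind that lawnmower analogy is easy to sketch. The figures below are invented purely for illustration - they are not benchmark numbers for any real chip - but they show why clock rate alone settles nothing:

```python
# Illustrative only: invented numbers, not measurements of any real CPU.

def effective_throughput(clock_mhz, instructions_per_cycle):
    """Rough work rate: cycles per second times useful work per cycle."""
    return clock_mhz * instructions_per_cycle

# A high-clock chip that spends many cycles shuffling values in and out
# of its few registers...
high_clock = effective_throughput(clock_mhz=3000, instructions_per_cycle=0.5)

# ...versus a lower-clocked chip that keeps operands in its many registers
# and retires more useful work per cycle.
low_clock = effective_throughput(clock_mhz=2000, instructions_per_cycle=1.0)

print(high_clock, low_clock)  # the lower-clocked design comes out ahead
```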

PowerPC processors have now officially graduated to a 64-bit architecture whilst only AMD have a viable product on the x86 side. And while the x86 continually thinks about 'backward compatibility' - all the way back to that pocket calculator 4004 - the PowerPC surges ahead.

It's hard to appreciate how much more bang Apple's OS X delivers for the buck: screen coordinates, colours, and alpha (transparency) values are expressed in floating point with incredible precision, whilst Windows and most other platforms still struggle along with 32-bit integers. Windows and the others deal in actual pixels on the screen - 'raster' thinking - whilst OS X deals in 'vectors'. In terms of screen brilliance, OS X is light years ahead of Windows and the rest of the competition.
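The raster-versus-vector point can be sketched in a few lines. This is a toy model, not Apple's actual graphics code: it just shows how snapping coordinates back to whole pixels at every step throws information away, where floating point coordinates keep it:

```python
def scale_float(point, factor):
    # Vector-style: coordinates stay floating point; nothing is lost.
    x, y = point
    return (x * factor, y * factor)

def scale_raster(point, factor):
    # Raster-style: coordinates snap back to whole pixels after each step.
    x, y = point
    return (round(x * factor), round(y * factor))

p_float = p_raster = (3.0, 3.0)
for _ in range(10):                      # scale up by 10% ten times
    p_float = scale_float(p_float, 1.1)
    p_raster = scale_raster(p_raster, 1.1)

print(p_float)   # grows to roughly (7.78, 7.78), as it should
print(p_raster)  # stuck at (3, 3): each step's fraction was rounded away
```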

But of course Microsoft want to catch up, and many of the features they're boasting about for their coming 'Longhorn' will only be bastardised versions of what Apple already have. Yet that still isn't the punch line: to get this same bang on a Windows box you're going to have to pay a lot more, and you're going to have to upgrade to a new kind of machine that isn't even being sold yet - while you can get all this today with an Apple, with no pain and relatively few bucks.


Apple cofounder Steve Jobs left his company almost as soon as the original Macintosh made it to market. He broke ground not far away in Redwood City to build another computer, the NeXT. Its operating system NeXTSTEP is what you get today when you buy an Apple.

[The World Wide Web was invented on a NeXT computer - by Tim Berners-Lee who called NeXTSTEP 'the first intuitive point-and-click and folders interface for personal computers'.]

It looks a bit different from the way NeXTSTEP did, but it's the same basic system. NeXTSTEP is actually a unique brand of Unix: it uses the 'Mach' micro-kernel architecture devised at Carnegie Mellon University to run the BSD Unix kernel. BSD Unix comes from the University of California at Berkeley, and the version Apple use is open source and called 'Darwin'. The very fact that the code is open means more eyes review it, making it better and much more stable. You won't find Microsoft doing anything of the kind: they'd be embarrassed to let people see what terrible code they write.

And NeXTSTEP itself is actually a 'GUI' placed on top of all that. And here's where it all starts to get very interesting, for NeXTSTEP is 'pedigree' as no other GUI anywhere can claim to be.

GUIs were brought into the research mainstream by Alan Kay of the Learning Research Center of Xerox's Palo Alto Research Center (PARC). Kay and his colleagues devised a programming language called Smalltalk specifically for the kinds of environments (with graphical user interfaces) they would be dealing with.

Smalltalk was a 'research' language: it 'interpreted' instructions rather than 'compiled' them. If you're not aware of what the distinction is, then all you need to know is that 'interpreted' code is slow, 'compiled' code is fast.

In other words, 'interpreted' code is fine for the research laboratory, but it's nowhere good enough for production - for commercial use.
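A toy sketch can make the distinction concrete. The mini 'interpreter' below (purely illustrative - in no way Smalltalk itself) looks up and dispatches every operation each time it runs; that per-operation lookup is exactly the overhead compiled code pays once, up front, by translating to machine instructions:

```python
# Each operation is dispatched by name at run time, every single run.
OPS = {
    "+": lambda a, b: a + b,
    "*": lambda a, b: a * b,
}

def interpret(expr):
    """Evaluate a nested tuple like ('+', 2, ('*', 3, 4))."""
    if not isinstance(expr, tuple):
        return expr                      # a plain number
    op, left, right = expr
    return OPS[op](interpret(left), interpret(right))

print(interpret(("+", 2, ("*", 3, 4))))  # 14
```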

So along came Brad Cox who took the Smalltalk paradigm and made it 'compiled' instead of 'interpreted' - he made a 'production' model of Smalltalk. And to this day, his new language, called Objective-C, is the only computer programming language descending directly from Alan Kay's Smalltalk.
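The Smalltalk idea Objective-C inherited is the 'message send': which method runs is decided at run time, by name, and an object can decline a message it doesn't understand. Here is a rough sketch of that idea in a dynamic language - the class and selector names are made up for illustration, not actual Objective-C API:

```python
class Document:
    def print_pages(self):
        return "printing"

def send_message(receiver, selector, *args):
    # Smalltalk/Objective-C style: the method is looked up by name at run
    # time, not bound at compile time.
    method = getattr(receiver, selector, None)
    if method is None:
        return None        # cf. checking whether an object responds to a selector
    return method(*args)

doc = Document()
print(send_message(doc, "print_pages"))  # printing
print(send_message(doc, "fax_pages"))    # None - message not understood
```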


And it shows. Whilst developers on other platforms struggle with 'hybrid' (hodgepodge) languages like C++ to write their code - and invariably trip up on all the artificial complications such a venture entails - Objective-C developers on the Apple platform find things self-evident, clean, and clear from the outset.

All good programming today is based on the C programming language by Dennis Ritchie of Bell Labs - the language used to make Unix the universal system it has become. C++ actually perverts the language in an attempt both to introduce concepts distantly akin to Smalltalk and to reintroduce some 'fetishes' its creator had - implementations otherwise castigated and abandoned by the programming community years earlier. C++ programming - and most GUI work outside of Apple is today in C++ - is a hair-pulling experience, and programs built with that language will never attain the conciseness, the clarity, or the stability of their Objective-C counterparts.

NeXTSTEP is further, like Smalltalk before it, more than a user's environment: it is simultaneously a developer's environment, and already at its inception more than fifteen years ago it was light years ahead of what was available on any other platform - a distinction it retains to this day. Programmers working with GUIs are invariably confounded by just how indirect and difficult program design can be; they can see how things should be, but are more or less helpless to change things on their own.

With NeXTSTEP it's all already been done. Things are already the way you want them to be. The simple task of connecting actions and outlets as they're called is truly simple - you don't need encyclopaedias of code. NeXTSTEP relies like no other platform on sensible reuse and the benefits to developer and user both are immeasurable.
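A rough sketch of that action/outlet idea - with hypothetical class names, not actual NeXTSTEP API - shows why no 'encyclopaedias of code' are needed: a control simply stores which object to notify (its target outlet) and the name of the action to invoke.

```python
class Button:
    """A control wired the NeXTSTEP way: it stores WHO to notify (the
    target outlet) and WHAT to ask of it (the action name), so no custom
    subclass or glue code is needed per button."""
    def __init__(self, target=None, action=None):
        self.target = target      # outlet: a reference to another object
        self.action = action      # action: the name of a method to call

    def click(self):
        if self.target and self.action:
            getattr(self.target, self.action)(self)

class Controller:
    def __init__(self):
        self.saved = False
    def save_document(self, sender):
        self.saved = True

controller = Controller()
button = Button(target=controller, action="save_document")
button.click()
print(controller.saved)  # True
```

The connection is data rather than code, which is the kind of reuse the text describes: the same Button class serves every button in the application.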

Put simply, your OS X programs are going to be better, run faster, and be more robust than what you could ever hope to see on any other platform. And you'll find yourself never paying for upgrades just to get rid of bugs: for the most part, the bugs are never there in the first place.

Why Not?

There are disadvantages and drawbacks too - and if you're really going to migrate, you should be aware of them. Nothing measures up to the arguments in favour of switching, but no system, no choice in life is perfect, and you're better off knowing where the warts are.

The NeXT Mac - An Unhappy Marriage?

This might go a bit deep, but its effects will still be felt by the ordinary end user: OS X is actually a 'marriage' between traditional 'Mac' thinking and NeXT's 'Unix' thinking - and realistically, the twain can never meet.

Unix is an extremely flexible operating system, perhaps the best the world has ever known. It is especially suitable for our online communications of today, and what with its basic tenets, its being open source, and with all the work done on it over the years, it's basically impervious to malfeasance attacks. In fact there is good cause to argue that if the world was running Unix today instead of Windows, the 'hacker culture' as we've come to know it would disappear: there just wouldn't be enough to attack.

But all the while Apple cofounder Steve Jobs was in Redwood City bringing NeXT to market, Apple lived on in Cupertino under John Sculley in what might be called 'the dark years'. Nothing of any import happened; Apple's product line stagnated; and those that stuck with Apple products instead of opting for Jobs's 'NeXT great thing' consciously chose to do so. And not all of them may have been happy to see him return.

The 'Mac' way of thinking - as epitomised by the infamous 'Mac community' - is a 'top-down' way of thinking - a 'form over function' paradigm. Developers who stuck with Apple, as opposed to those who followed Jobs to NeXT, are today stuck in an antiquated architecture Jobs and Apple would love to abandon entirely but so far have not dared to.

If you're completely new to the Mac platform, this might all take you by surprise and seem incomprehensible. But it's been fairly common that good software ported to the 'older' Mac has failed whilst bad software written directly for the Mac has succeeded: even though users knew the product was a bad performer, the latter just looked better.

Doug McIlroy, mentor of the team that created Unix, expressed the matter as follows.

The day Linux gives the Mac diagnostic that you can't open a file because you don't have the application is the day Linux becomes non-Unix.

Everything is expected to be 'a certain way' on the Mac, and despite the operating system being completely new, the 'Mac' diehards who do not appreciate Steve Jobs or what he did in Redwood City can yell louder than the 'Nexties' who are happy to see NeXTSTEP finally take its first steps out into the real mainstream.

Most OS X gurus are in agreement that the only warts the system has are those forced upon everyone by this group: they yelled, they protested, they literally booed, and Jobs and Apple tried to accommodate - and the result each time has been the same: the system is just not as good once they bullied and got their way.

Interfaces become confused; things work one way but at the same time work another; information from the operating system is given with reluctance as the underlying issues are truly embarrassing.

So there is a war going on in Cupertino: the Apple legacy crowd there constitute 'the bad guys' whilst the 300+ NeXT engineers who followed Jobs back are 'the good guys'. Everyone already knows this, and yet the difficulties still persist.

This website has ample examples of when the 'old' and the 'new' conflict to no one's benefit.

And application development can take two turns as well: the Macintosh Toolbox is no more, but there's Carbon instead.

Neither has anything to do with NeXTSTEP, Objective-C, Alan Kay, or any of the other wonderful things in OS X. But Carbon was wired in to give old 'Mac' developers a chance to catch up. Unfortunately most of these developers have not used it as the transitional measure it was intended to be, but continue to churn out architecturally (and aesthetically) inferior products.

It's fairly easy to see what's 'Cocoa' (real NeXTSTEP) and what's Carbon once you get used to the system. Cocoa apps take advantage of the full power of NeXTSTEP (and they look a lot better too) whilst Carbon apps get nothing for free - written in code so tangled even a Windows developer would turn pale.

OS X comes with so much technology already built in that a lot of the 'legacy' software for the old Mac, even if it's updated with Carbon, is actually irrelevant - and you, as a user and potential customer, have to keep on the alert about this.

Almost all of Apple's own software offerings - and you get a lot already with your computer out of the box - are genuine Cocoa apps; it's the third party vendors you have to keep an eye on.

What About Linux?

Yes, what about it? Linux made it to Mars. Linus Torvalds has forever written himself into computer science history, leading in his inimitable laid-back fashion this open source Unix movement originally begun to give him a low cost alternative to SunOS. Linux is free - not always free of cost, but it can be. And even if you pay a bit for a CD release, you might figure it's cheaper than migrating to Apple: you can often use the same PC hardware Windows used to get you into so much trouble.

And Linux is still a far cry from Windows: it too is Unix, and has all the concomitant stability, power, robustness, and imperviousness to hacker attacks all the Unix releases have.

But whilst OS X is a perfect fit for the typical end user, Linux is not. Linux has grown to almost be 'de facto' on network servers (especially on the Internet together with the web server Apache) but a server installation is by definition an installation you don't have to work with all the time - it just sits there and does its thing.

So even if it can be a lot more work to get a Linux box up and running, once you do it's a perfect choice for the back room with all the servers: it's going to just run and run and run...

On the client side - on the end user side - the geeks responsible for the various 'distros' of Linux have not come as far, and odds are they never will come as far as Apple have with OS X.

For one thing, they're invariably using the same sort of inferior tools Microsoft use. Their code can easily become overly entangled. (This is at the application level, not at the actual operating system level, where Linus would never tolerate such nonsense.)

For another, the Linux distros - as you will see if you check them out - have never had the benefit of skilled graphics designers to create their visual interfaces, because they've been 'open source' all along - 'free' in the sense that no one was ever getting paid. Linux GUIs are getting better, and at least the latest release of KDE has made great strides, but they're working with inferior tools and no real on board expertise at graphics design, and their offerings will always look the 'cheap copycat'.

Setting up a Linux box at home can be a bit rough too. Nothing is given; anything may decide not to work; you're in the 'Windows' hardware market where there are so many peripherals and components your head will spin. Attaining the kind of relative compatibility Microsoft achieve is a very expensive process - a process Apple can sidestep completely and one the Linux community simply do not have the resources to address.

And at the end of the day, no matter how well you get your Linux computer to purr, it's not going to be substantially or even theoretically better than OS X. At the very best - something really unattainable anyway - Linux would be as good, but no better.

So it turns out to be a lot of effort - which may or may not succeed - to get the same thing a quick Mac purchase would have given you weeks or months earlier.

Where To Go From Here?

Will the world go the way of Unix? The servers of the Internet already have; all that remains, despite the statistics continually citing Microsoft's 95% market share on the end user desktop, is for shops and kitchens to migrate as well.

Given the obscene level of disadvantages in even daring to connect a Windows PC to the net today, that final process should not take long at all.

Apple are not and should not be the only high quality supplier of Unix-based computers. The hope is also that their NeXTSTEP technology will not remain in the margins but make it out into the other 95% of the market where it belongs so everyone - developer and end user alike - can benefit.

Further Reading
What The Press Has Said
Apple's Top Ten Questions
More OS X Switcher Stories
Top Ten Reasons To Switch

Copyright © Rixstep. All rights reserved.