Home » Learning Curve » Developers Workshop
Apple: When Closed Systems Don't Work
It's not like everyone didn't tell them so.
Open systems: they're great. They're built by various companies (and individuals) for a common goal. The code is subject to peer review.
Closed systems: they're not great. They suck. They're built by one company and the source code is kept secret. There is no peer review. But there are lots of bugs and vulnerabilities.
Debian, FreeBSD, OpenBSD: examples of successful (and useful) open source projects that really work and really contribute. Each project can take care of its own needs and pass the benefits on to anyone who wants to use the code.
Find a bug in the project? Report it to the maintainers. Let them sort it out.
Open source projects also save a lot of time. The project owners take care of development. The users only benefit. Users can contribute too - but submissions go through a rigorous review before being put in the source tree.
Apple engineers use a lot of open source code. But the way they use it means more and more work all the time. Each module has its own caretaker at Apple. This ends up being extraordinarily complicated (and dangerous). No one else in the company can use an update until the caretakers have force-fitted the code to suit Apple's needs.
There are myriad examples. One of the most memorable was the sudo update which reached the rest of the world some 18 months before Apple adopted it. Updates of sensitive modules invariably contain bug fixes and plugged vulnerabilities. Uber-hacker Charlie Miller pointed out how easy it was to hack Mac systems: just check their huge 'acknowledgements' file on every system and compare versions with those published online at the project websites. Are the Apple versions outdated? Yes, most often. Then check the change logs at the project websites to see what holes they've plugged - those holes will still be open in Apple systems. Burnt toast.
That acknowledgements file is now half a megabyte in size. There's a lot in there.
The Oompa Loompa scare is a good example of how things can go wrong. Creator codes and file types from the old standalone Mac simply don't work in the Internetted world. They can't scale to a worldwide audience. The idea that every third-party vendor should apply to Cupertino for codes is ludicrous.
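A creator or type code was just a four-character tag packed into 32 bits, which is why the namespace was finite and had to be centrally registered. A minimal sketch of that encoding (the helper name is ours; the 'ttxt'/'TEXT' pair is the classic SimpleText creator/type):

```python
import struct

def ostype_to_int(code: bytes) -> int:
    """Pack a classic Mac four-character OSType into its 32-bit value."""
    if len(code) != 4:
        raise ValueError("OSType must be exactly four bytes")
    return struct.unpack(">I", code)[0]  # big-endian, as on the 68K/PPC Mac

# The classic SimpleText creator/type pair.
creator = ostype_to_int(b"ttxt")    # 0x74747874
filetype = ostype_to_int(b"TEXT")   # 0x54455854

# The entire worldwide namespace is just 2**32 values - hence the
# central registry in Cupertino the article calls ludicrous.
print(hex(creator), hex(filetype))
```

Four bytes per code for every vendor on the planet: the arithmetic alone shows why the scheme couldn't survive contact with the Internet.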
The forerunners of the extended attributes are ridiculous in today's world as well. That of course didn't stop a number of higher-profile freaks from making a big stink when an Apple technical note pointed out their futility. Extended attributes today (with the new house-trained 'Unix' interface) can be used for anything one wants - but they're not portable. And yet some better-known techie wannabes cried when they realised they wouldn't be able to use them as search keys anymore.
And the author of Oompa Loompa stepped right in. And showed the world how easily such infantile 'technology' could be exploited.
Teachers' aides required special login and logout procedures. Apple's response led the way to the Opener hole, which until very recently was the biggest security hole ever in the world of personal computing. A hole so big that its author described it as a crater. Pop a file in a specific format into an unprotected area of the file system and own the entire system on the first reboot. Opener was a horrible eyesore that Apple still needed more than five years to sort out.
And what with all of Apple's funky appendages to ordinary straightforward files, they of course have to modify FreeBSD intrinsics so their operations work correctly - and that leads to many wondrous things like the 'massive data loss' scandal.
Why do things like these happen? Because Apple stubbornly refuse to stand on the shoulders of giants, because Apple refuse to open their system code to peer review, because their managers get such strange ideas and are never opposed by the people who should know better.
Rob Braun tried to make Darwin 'open' and in the end had to give up. He estimated there were only a handful of people who'd been able to build the kernel from the files provided. Where's the peer review there?
Companies like Adobe with their flashy interfaces want to keep their source code proprietary. Let them. No one cares much if Adobe apps crash (outside the world of Adobe). But kernel code? And driver code for that matter? That has to be open. The alternative is what happened this past weekend with Apple's OS X Vista.
For it turns out that the engineers at Apple have really outdone themselves this time - they've made a system that can be hacked to root by anyone running code on an admin account. For the first time ever on any personal computer anywhere. With no trickery, no 'hacking', no shell code. Just issue a few commands either from a command line or hidden within a seemingly innocuous app downloaded online. Do that and the host system is totally toast. Totally. No reboot required. Pwnage is total and instantaneous.
People start wondering.
- How could Apple let something like that out the door?
- Didn't Apple have anyone performing security audits on the system?
- Why did Apple change the code in the first place? That code used to work!
Those are all valid questions. But they miss the great point in the bigger picture - namely that every human enterprise is subject to error, that code owners are the worst possible auditors for their own code, and that it took Unix thirty years to get to the point where it was ready for the evils of the Internet.
Unix isn't a system built with security as the sole priority. But it had security as an obvious requirement. The environment of Bell Labs demanded a multiuser system - in contrast to the 'Lego' quality of early PCs and Macs.
A lot of that early Unix code was very clever. Take the login process for example. Those Unix gurus didn't give things away to potential intruders the way Apple do. Your login screen in Unix said only the following.

login:
That was it. You typed in something supposedly representing your account name at the prompt. You hit return and the system burped back at you.
The system never told you if the user name was correct.
But then some clever dude thought of a way to hack into things. He created a fake login program that behaved and looked the same way - but the program stored all the user names and passwords submitted.
He put copies of this program on all the Unix terminals in his open-plan office. And caught the odd admin or two.
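The spoof above can be sketched in a few lines - the function name and log format here are ours, not from any historical exploit. The point is the symmetry: because the real login gives nothing away, a fake that gives nothing away is indistinguishable from it:

```python
def fake_login(read_line, log):
    """Mimic the terse Unix login dialogue while capturing what's typed.

    read_line(prompt) supplies the victim's input; log collects the
    stolen pairs. Like the real login, the fake gives nothing away:
    whatever is typed, the answer is always the same.
    """
    user = read_line("login: ")
    password = read_line("Password: ")
    log.append((user, password))
    return "Login incorrect"

# Simulated victim session: an admin tries to log in at the rigged terminal.
stolen = []
answers = iter(["root", "s3cret"])
response = fake_login(lambda prompt: next(answers), stolen)
print(response)   # always 'Login incorrect' - the victim just retries
print(stolen)     # the harvested credentials
```

After printing 'Login incorrect' the real exploit would exec the genuine login, so the victim's second attempt succeeds and nothing looks amiss.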
The same thing happened with the $PATH variable. $PATH used to include the current working directory. So the clever hacker figured out a great plan.
- Start a rumour that you're a big bad hacker and you can destroy things.
- Wait for one of the system admins to take the bait.
- The admin will cruise into your user area looking for suspicious files.
- Said admin will use the 'ls' program to list your files.
So you plant your own hacked version of ls in each of your directories. The admin's going to be running as root when he pops in for a visit. So let his invocation of your ls kick off further root code that creates a ghost account and opens a back door - things only a superuser could otherwise do. System is toast again.
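The mechanics can be sketched with a toy $PATH resolver - the function and directory names here are illustrative, but real shells do the same left-to-right walk over the path list:

```python
def resolve(command, path_dirs, files_by_dir):
    """Return the full path of the first match for command along path_dirs,
    mimicking how a shell walks $PATH left to right."""
    for d in path_dirs:
        if command in files_by_dir.get(d, ()):
            return f"{d}/{command}"
    return None

# A filesystem sketch: the hacker's home directory holds a trojan 'ls'.
files = {
    "/home/hacker": {"ls"},     # the planted trojan
    "/bin": {"ls", "cat"},      # the real thing
}

# Old-style PATH with the current directory first: the admin, standing
# in /home/hacker as root, runs the trojan without knowing it.
print(resolve("ls", ["/home/hacker", "/bin"], files))   # /home/hacker/ls

# Modern PATH without the current directory: the real ls runs.
print(resolve("ls", ["/bin"], files))                   # /bin/ls
```

This is exactly why the current working directory was eventually dropped from default $PATH settings: whoever controls an earlier path entry controls what a later visitor executes.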
These and countless other kinks (see Hacking Exposed) were there in the nascent years to give system architects new challenges and hair-pulling sessions. But most of them are twenty or more years old. Unix still gets hit with new hacks now and then but it's rare. Unix never gets hit with the type of childish code scandal Apple's code gets hit with so regularly.
This isn't to say developers outside Cupertino don't make stupid mistakes - it's only to say that stupid mistakes outside Cupertino are most often caught in time: things aren't pushed through design meetings in such a panic, nobody's in a rush to get untested system changes out the door, and said changes are always reviewed in a fashion the secretive Apple can never accomplish.
Unless Apple are correct in their stubborn belief that the 'computer' will disappear, fully replaced by 'tap devices' (and even then they're vulnerable, as McAfee CTO George Kurtz has demonstrated time and again), they need to take a critical look at their production methods. Their wasteful development costs are significant but their losses in prestige are even more so.
Rixstep Industry Watch: Opener 3.9
Rixstep Industry Watch: Get Root on 10.5.4
Rixstep Industry Watch: You're Root, Dude!
Rixstep Industry Watch: The Story of Renepo
Rixstep Industry Watch: Got Lion? Get OWNED?
Rixstep Industry Watch: Oompa Loompa Quotes
Rixstep Industry Watch: ARDAgent: Here to Stay?
Rixstep Industry Watch: A Leopard Mail Vulnerability
Rixstep Industry Watch: It's Not New It Starts with 10.2
Rixstep Industry Watch: The Legend of Oompa Loompa
Rixstep Industry Watch: For Apple, This is the Year That Wasn't
Rixstep Industry Watch: 'Huge, Crazy, Ridiculous OS X Security Hole'
Rixstep Learning Curve: Rooting 10.5.4
Rixstep Learning Curve: Apple's Wi-Fi Fallout
Rixstep Learning Curve: Son of Input Manager
Rixstep's The Technological: The Not So Sinister Finisterre
Rixstep's Coldspots: What's Wrong with This Picture?
Rixstep's Red Hat Diaries: Screaming Apple Fanboy Idiots
Rixstep's Red Hat Diaries: Number One at Almost Everything
Rixstep's Hotspots: Leopard: OS Xhumation
We at Apple take security very seriously.
- Apple Inc
I believe that something big is going to happen.
- yankeefan24 on Oompa Loompa
An Apple spokesperson was not immediately available to comment.
- Reuters/CNN Money
It isn't anything. I opened it in Terminal and it did nothing. I checked the logs and the running processes and there was nothing foul going on.
- Phreak.net on Oompa Loompa
This is a very very sad day for the Mac platform. I always hoped that this would not happen in my lifetime. I am almost in shock now. I can't believe this is reality. All because of this bastard with his pics. I am extremely pissed, sad, and scared. This guy needs to pay - this is war IMO.
- CoMpX on Oompa Loompa
It does not exploit any security holes. There are zero Mac OS X viruses. It'll be interesting to see which media organizations, if any, pick up on this and run the incorrect story of the first OS X virus. This is what it's come to: making up a OS X virus where none exists.
- MacDailyNews on Oompa Loompa
Oompa Loompa is actually a combination of all three types of malware. First, it is a Trojan horse - an executable hidden inside a file disguised as a graphic file; then it is a virus, as it replicates to other applications on a user's computer; finally, it is a worm when it sends itself via iChat to other users. OSX/Leap-A: a proof of concept piece of malware. Leap-A is merely an attempt to disguise an executable program as an image in effort to trick the recipient into launching the program. Launching a program in Mac OS X requires the user to enter their password, an indicator that should clue most users into the fact that it is not what it appears to be.
- The Mac Observer on Oompa Loompa
Leap-A is not a virus. It is malicious software that requires a user to download the application and execute the resulting file. Apple always advise users to only accept files from vendors and Web sites that they know and trust.
- Apple Inc on Oompa Loompa
Unless Apple faces up to the security issues its users face, its reputation for making secure operating systems, already damaged by its mishandling of these recently discovered vulnerabilities, will be further tarnished.
- John Leyden, The Register May 2004
The reason security research on OS X is so interesting is that Apple take the injudicious move of branching off from tried and true Unix code to create something they're rather reluctant to call Unix anymore. Unix has had a good thirty years to mature and more researchers inspecting it by an order of magnitude. Apple use a closed source model and they're venturing out into new territory where the risk for exploits grows geometrically. And they're carrying with them legacy ideas from the birth of NeXT which predates the birth of the web. And they don't listen.
- The Technological November 2006