Tom Ferris of Security Protocols is an acknowledged security researcher previously employed by eEye Digital Security, Foundstone Corporation, and the United States Department of Defense. At the beginning of 2006 he reported seven 'highly critical' security holes to Apple. After waiting two months for a response and a fix, he went public with the information. You can read about the holes - and test them - here.
The term 'highly critical' is an industry definition: it describes a flaw requiring no social engineering or user interaction. Naturally it's the highest level of alert that can be issued. The Windows WMF flaw was also 'highly critical'.
What's interesting - and ironic - about these holes is that all you need to do is click on Tom's links with your Safari browser (or in some cases download a file and try to open it in Preview) to see what happens. You crash and burn. You don't need more proof from a 'proof of concept' than that: you see it for yourself.
But not long ago Apple released a patch for an earlier security hole - a patch universally denigrated by the world press for being 'lame' and incomplete. The hole is found deep inside Apple's makeover of NeXTSTEP, but Apple, boxed into a corner and pushed to do something quickly, created ad hoc safeguards in their own web applications instead. Nothing more has been done; the real hole remains to this day.
Which meant that users of Camino, Firefox, Thunderbird, and other ISV products had to revert to Apple's own product line if they wanted protection online. For the flaw was not at application level - it was at system level.
But now Tom Ferris demonstrates that even Apple's own web applications - or rather any applications involving graphics rendering - are wide open in a half dozen other ways. Which of course leaves OS X users boxed into a corner of their own - they can't use Camino, Firefox, or Thunderbird and now they can't use Apple's own applications either.
Tom Ferris played by the rules. Advisories are kept on ice for two weeks to give the vendor the opportunity to respond with a fix. This is the industry standard. If the vendor does not respond within two weeks, the reports should go public so people can protect themselves. Tom Ferris didn't wait two weeks - he waited two months.
Apple had ample time to address the issues and did nothing.
The disclosures by Tom Ferris are being widely discussed across the Internet. Some users are justifiably concerned; others think it's only more 'Rob Enderle FUD' and nothing to worry about, as no exploits have yet surfaced.
The POCs created by Tom Ferris show where the holes are - they don't attempt to hijack computers. Tom Ferris didn't feel it necessary - or even good manners - to go that far. And he didn't have to. The principle for creating an exploit is the same: if you can overrun the stack or the heap you can most likely hijack the application. All you need is some good shell code and a bit of sweat and hair pulling and you're 'in like Flynn'. For bad code always means exploitable code.
Delaying an advisory to craft an exploit of a security hole is foolhardy anyway, and against the best interests of both the vendor and the users of the product. There is no guarantee that Tom Ferris was the first to find the holes. If the black hats out there have already found them, they're certainly not talking - and time is of the essence.
Most of the planet was caught with its knickers down when the Love Bug hit in May 2000; many people had warned for years that things like this could happen, but no one listened. At the end of the disaster, damages were tallied at five and one half billion US dollars.
OS X suffers from endemic flaws, most of which are directly attributable to the futile attempt to marry the standalone architecture of the classic Apple 'beige box' with the standards-compliant NeXTSTEP. These flaws have been cause for concern to software engineers, and the events of 2006 show the concern was well founded.
Apple's classic retort to most any security advisory is that users should not run untrusted software. While this is impossible to implement in practice - and just the kind of attitude that got the planet screwed up with Windows flaws in the first place - it doesn't apply to the current situation anyway: the slew of holes Tom Ferris published require no user interaction.
Any website at all can embed malicious images; an OS X user surfing the web will by definition not know when disaster hits; the very fact that Apple's own web applications can be attacked in this way means no one is going to be safe. If the holes are 'out there' then anyone can work on them and craft exploits for them.
The following was found posted at MacFixIt's forums.
All I can say is that I personally am not concerned with some buffer overflow sensitivities that allow a potential threat. Until someone actually capitalises on the potential, it seems to me that continuing to exercise an intelligent awareness remains the best course of action.
A comment on this attitude from the same thread:
Oh that's reassuring. Yeah, let's wait for some users' machines to get exploited.
At the end of the day, the only 'intelligent awareness' is to wait and see - and hope - others get hurt first.