About | ACP | Buy Stuff | Industry Watch | Learning Curve | Newsletter | Search | Test Drive

The Long Run - Extended Attributes from Tiger to Catalina (2)

Remember the Steve Gambit?


You'd basically have to be working at Apple - preferably not in the HIG but with something substantial - to know the details of XA storage. An XA is bound to a file, but it's not part of the file's data fork, so to speak: its on-disk storage has to be kept separate in some way, and other APIs and file system rules have to be adjusted to accommodate it. And even though XAs aren't part of the standard file data stream, they still have to adhere to the same access rules - changing the XAs on a read-only file has to be verboten, for example. But where and how they're stored is another matter, a technical detail mostly kept out of the public domain by secretive Apple. But yes, there must be a hidden parallel stream.
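The API surface, at least, is public. A minimal sketch, using the Linux flavour of the xattr calls that Python exposes (os.setxattr/os.getxattr - macOS's setxattr(2)/getxattr(2) are close analogues); the attribute name here is an illustrative stand-in, not a real Apple attribute:

```python
import os
import tempfile

def xattr_round_trip(payload):
    """Attach an XA to a file and read it back."""
    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name
    try:
        # The XA rides alongside the file, not inside its data stream:
        # the file's size and contents are untouched by this call.
        os.setxattr(path, "user.example.note", payload)
        assert os.path.getsize(path) == 0   # data stream still empty
        return os.getxattr(path, "user.example.note")
    except OSError:
        return None                         # filesystem without xattr support
    finally:
        os.unlink(path)
```

The assertion in the middle is the point: the attribute round-trips while the file's own data stream stays empty - a parallel stream in practice, wherever the file system actually puts it.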

Aside from a handful of exceptions that are likely to disappear over time, XAs can be whatever you want. Often stored in XML (property list) format, they can contain anything you want, and are commonly named with a 'backwards Internet domain' nomenclature, such as 'com.apple.MyApplication'. Check your Preferences directory and you'll see hundreds of examples. Those property lists are often stored in binary format, which makes them unreadable for ordinary users - not the case with the original XML format.
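That difference is easy to demonstrate with Python's plistlib, which supports both serialisations - the preferences dictionary here is invented for illustration:

```python
import plistlib

# The same made-up preferences dictionary, serialised both ways.
prefs = {"ShowStatusBar": True, "WindowOrigin": [40, 60]}

binary = plistlib.dumps(prefs, fmt=plistlib.FMT_BINARY)
print(binary[:8])    # binary plists open with the magic b'bplist00'

xml = plistlib.dumps(prefs, fmt=plistlib.FMT_XML)
print(xml.decode())  # the original, human-readable XML form

# Either form round-trips back to the same dictionary.
assert plistlib.loads(binary) == plistlib.loads(xml) == prefs
```

The binary form is byte soup to a casual reader; the XML form is plain text anyone can inspect in an editor.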

Today we find XAs such as 'com.apple.rootless' with data such as the Unicode 'KernelExtensionManagement'. And 'SubmissionPolicy'. And 'bug_type'. And 'os_version'. And 'timestamp'. And they're all used, more or less, in an 'enhancement' type of way.

Then we have 'com.apple.quarantine'.

The Apple Quarantine - The Apple Walled Garden

Someone at Apple got the idea early on that a user environment could be likened to a garden with high, virtually insurmountable walls, and that those walls could be used to control everything passing from the outside to the inside and back again. Early versions of this wall may have been implemented through software applications that interfaced directly with the web, such as Mail and Safari, but today it's controlled from deeper down. Anything entering the system through known channels is slapped with a Quarantine decal. It's often said, and often assumed, that this decal wears off over time, but in reality that's simply not true. And the system is as thorough in this respect as possible. 'Archive Utility.app', used to unzip downloads, has dependencies such as the private framework PackageKit that are impregnated with extra code to keep a lookout for Quarantine XAs and slap the same decal on each and every file generated. (One may of course question the wisdom of such extravagance, but first things first.)
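The decal itself is just a small string. A hedged sketch of picking it apart, assuming the commonly observed layout of the com.apple.quarantine value - four semicolon-separated fields: hex flags, a hex epoch timestamp, the name of the downloading agent, and an event UUID. The sample value below is invented, not taken from a real system:

```python
from datetime import datetime, timezone

def parse_quarantine(raw):
    """Split a quarantine XA value into its observed fields."""
    fields = raw.split(";")
    return {
        "flags": int(fields[0], 16),    # hex bit flags
        "when": datetime.fromtimestamp(int(fields[1], 16), tz=timezone.utc),
        "agent": fields[2],             # the app that downloaded the file
        "event_uuid": fields[3] if len(fields) > 3 else None,
    }

sample = "0083;5f8a2c00;Safari;A1B2C3D4-0000-0000-0000-000000000000"
info = parse_quarantine(sample)
print(info["agent"])   # Safari
```

Nothing in those fields encodes an expiry - which is rather the point of this article.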

Everything you get onto your system is supposed to be quarantined, meaning it's under observation, meaning it has to suddenly start behaving according to a new set of rules. And it's not just the stuff you download - it's also stuff you generate through so-called 'sandboxed' software. What are your sandboxed applications? Look in ~/Library/Containers. If you have a good file manager, you'll see that a lot of the stuff there (and there's more there than there are bugs on the Mother Ship) is actually symlinks to stuff outside the sandbox. And most if not all of the tens of thousands of files in there will be stamped as 'quarantined'.
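Counting them is straightforward. A sketch, assuming Linux's os.listxattr (macOS's listxattr(2) is the analogue); the lookup callable is injectable so the idea can be exercised even on a file system that never carries Apple's attribute:

```python
import os

def quarantined_files(root, listxattr=os.listxattr):
    """Walk a tree, collecting paths that carry the quarantine XA."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                if "com.apple.quarantine" in listxattr(path):
                    hits.append(path)
            except OSError:
                pass  # unreadable file or xattr-less filesystem
    return hits
```

On a Mac one would point this at ~/Library/Containers and expect a very long list back.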

The word 'quarantine' dates back to the 1520s, originally meaning 'period of 40 days in which a widow has the right to remain in her dead husband's house'. Yes, it derives from the Latin 'quadraginta', meaning 'forty', which in turn comes from 'PIE' (Proto Indo-European).

By the 1660s, it was used to denote the 'period a ship suspected of carrying disease is kept in isolation', this directly from the Italian 'quarantina giorni'. For Venice had the custom, starting in 1377, of keeping ships from plague-stricken countries waiting off its port for 40 days.

And the later extended sense of a period of forced isolation dates from the 1670s.

But Apple's quarantines never end. It's as if the suspect seafarer were finally brought ashore, yet Venice somehow had the manpower to keep a spy tailing the new landlubber day and night. That, of course, is a lot cheaper and easier for a computer company than it would have been for a Venetian doge.

Apple's Gatekeeper 'system' - it's a system rather than a single software module - is Apple's way of keeping things in quarantine not just for forty days, but forever. It's no skin off Apple's back if trusted files and software never get completely 'cleared', their quarantine stamps never removed. At worst, the overhead nudges you to upgrade your hardware to something faster - which benefits them, not you. Quarantine stamps never go away, not in practice, and certainly not after the customary forty days. They stay forever. You lose in that scenario.

It's really hard to reconcile this clinically paranoid attitude on Apple's part with their earlier - justifiable - arrogance about how impregnable the Cupertino fortress really was. Tag lines such as 'Rock Solid Foundation': they were after the Windows switchers back then. (This predates the iPhone, of course.) And those 'Mac vs PC' adverts - a worldwide campaign with a lot of truth to it.

(How many adverts were there all told? 66? Over how long a period? According to Wikipedia, at any rate, the campaign began in May 2006 - the days of Tiger, predating the announcement of the iPhone in January 2007 - and continued until October 2009. The adverts, all directed by Phil Morrison of Epoch Films, were hailed by pundits as the best campaign of the new millennium.)

Yet what happened there anyway? The Mac was presented as a cool and safe alternative to the Windows PC - which it was - even as the Windows PC got slammed worldwide by malware attacks previously unheard of. Billions were shoveled down the black holes of emergency security and administrative measures to clean up the rubble after yet another Microsoft outbreak. The distinction was simple, even though it was never named specifically: it was Microsoft Windows against the rest of a largely Unix-oriented world. There are systems out there more secure than Apple's Mac, but that hardly mattered: the jump from Microsoft to anything not even remotely connected to Microsoft is a quantum leap, while the step up from Apple's BSD-based system to, for example, the supremely bulletproof OpenBSD is a matter of merely stretching one's toes a bit further in the same direction.

There was a time when Mac users lorded it over PC users in the matter of CPU architecture. The PPC was infinitely superior, the Intel downright abysmal. And surprise surprise: this was basically true. The PPC had another advantage that couldn't be beaten or negated, though it wasn't much talked about: typical malware 'shellcode' was almost universally written for Intel PCs, not PPC machines like Apple's. All that 'bad code' out there, even if it somehow made it onto your system, simply would not run - your Mac processor didn't speak the same language and couldn't understand what the shellcode was trying to say.

That was to change overnight, however, as the clock-rate war ensued. Intel processors had to have a faster internal clock: they had relatively few onboard registers compared to the PPC, so of course they had to keep swapping things in and out of memory all the time, something the PPC rarely needed to do. Yes, the clock-rate hubbub was a complete myth, but perception can be everything, and market share and bottom lines, not unadorned facts, are what motivate stockholders.

IBM came out with their 'Cell' technology, typically brilliant for IBM. First they'd made the world's first CPU that could perform all operations in a single cycle - even memory ops, which for Intel took a walloping twelve cycles - and they perfected that technology to the point where, as one IBM exec expressed it, 'we now have the perfect design - all we need to do now is keep upping the clock rate'. And once that had run its course, they switched again, creating the dynamic Cell architecture, which made it possible to squeeze lots of itsy-bitsy mini-PPCs onto a single chip. The sky wasn't the limit - there was no sky.

But the Cell processor was suitable for games consoles only. It ran too hot for laptops. IBM estimated they'd need additional funding of $250 million to produce a cooled-down Cell for portable use. The snag was that no one was interested - save perhaps Apple. But Apple's projected earnings from Mac sales, playing to a platform that put limericks in kernel memory, weren't enough to entice IBM. Apple would never get a Cell processor for their laptops. The move to Intel was forced on them. The move was inevitable.

But how to convince Mac users that this was a good move? Apple marketing took care of it. Suddenly, almost overnight it seemed, the fanbase fell silent about the clock-rate war - Intel was of course the way to go, hadn't it always been?

Never mind that Intel processors were woefully vulnerable to malware shellcode - the move was necessary to preserve a perceived PR advantage. Or something.

The point being, of course, that Apple marketing could once again pull the wool over everyone's eyes and get people to loudly proclaim that Intel processors were better when - only months earlier - they'd been screaming the exact opposite.


Copyright © Rixstep. All rights reserved.