Don't Subtract People!
Richard Mendes on the Intel cutbacks.
Companies always seem to think the solution to their profitability lies in slashing costs, usually by slashing people.
Idiots. Worse: imitative idiots.
Get it straight: you increase profitability by providing better products and services than your competitors. You can cut costs only so much before you're slicing muscle and bone, not fat. You need to increase revenue and profit margins, which you do either by selling more for less or by creating products for which people will pay a premium!
The cost-cutting solution rarely works by itself; only a company that has grown hugely fat and happy derives any lasting benefit from it. Throwing 10,000 people out of work is just a one-time shot in the arm. Of course, that's how senior executives make their numbers and collect their fat, largely unearned bonuses. 'Après moi le déluge!' And by the time the deluge arrives, these clowns are enjoying their early retirements.
At least Intel has it partly right in starting with management, the people who are ultimately responsible for the company's problems; the workers are just doing what they were hired to do. But I'm sure Intel will dump a layer of middle management that was likewise just doing what it had been tasked to do. Where they should start pruning is at the top, 'cause that's where you find the idiots who dug the hole.
But they've probably just hired some MBA consultants armed with spreadsheets to solve their problems. The result: a largely unintelligible report that makes the consultants look smart (so they can show they earned their fees and impress prospective clients), plus a formulaic solution to the problems.
Guys, if the competition (AMD) is beating your ass, cut your prices to the bone and take some of those billions you've piled up and put them into R&D.
IBM, Motorola and Apple did it by adapting IBM's Power RISC processors. IBM is zipping along with its Power and Cell technologies, while Intel is mired in an architecture that has been obsolete for years. Yes, they've goosed performance by shrinking circuitry and finding ways to cool the resulting processors (thermal grease, anyone?), but there are limits to that approach. Dual-core processors have let them extend the current technology, but where do you go from there?
64-bit, anyone? More registers? RISC, which has proven itself as a technology? How about building in some extra (but costly) circuitry to emulate the current line in hardware/firmware until the transition to RISC is complete?
Add value! Don't subtract people.
Richard Mendes is a former systems analyst and marketing consultant specialising in IBM and DEC rollouts.