
Forget the White House Sideshow. Intel Must Decide What It Wants to Be.

Worse, I fear, is that the company culture within Intel is possibly incapable of making such a radical turn as to become a service company (Foundry).
For good or ill, I think Lip-Bu Tan has pretty much spelled the end of the Intel culture of old. Intel hasn't had a bloodletting like this since Grove and Moore moved out of DRAM in 1985. Much like Julius Caesar crossing the Rubicon, when you lay off 40% of your workforce over a two-year period there is no going back. Now Intel's only hope is to stay laser-focused and kill off the bureaucracy so it can move quickly. It will be abundantly clear which remaining managers are obstacles and which get things done. I expect those who are obstacles to be shown the door.
 
Sometimes the path to success only lies on the other side of a good bloodletting ;)
 
I agree with you. Intel, like many other companies before it, has become a victim of its own success. What worked worked so well that it could never be questioned. When it quit working, there was simply no appetite to even entertain the changes that had to be made. Worse, I fear, is that the company culture within Intel is possibly incapable of making such a radical turn as to become a service company (Foundry).

I would bet good money that you are correct; however, I doubt that we will find out. It is my current opinion that dumping money into the same machine, using the same cogs and mechanics as before, will simply suck up that money like water poured on desert sand.

I also think that the Intel foundry has become the bogeyman inside Intel (someone correct me if I am wrong). It seems like the losses are largely blamed on the foundry results.

Where else should the blame be placed? I am likely not qualified to say. My guess is that there were simply too many risky bets placed on too many endeavors that didn't play out favorably. I kind of see this in both the foundry and the processor architecture. It's not new, either. I still recall the P4, Rambus, and Itanium days. Like I said: risky.

Still, Intel held the world in its hands for decades. As spectacularly as it failed (P4), it often recovered with a vengeance (Core 2). In the days of 14nm (and before), Intel dominated the lithography world, sometimes holding a two-node shrink advantage over everyone else (IIRC). It's almost impossible to imagine that they WOULDN'T keep holding to the recipe that kept them on top for decades.

IBM comes to mind. Anyone buy an IBM PC lately? ;) Perhaps this is a good example of a company utterly missing the boat, burning to the ground, and rising from the ashes. Surely IBM will be taught to prospective MBA students for decades to come. Intel may also become a subject for these students. (Off topic) I am very sure Tesla will as well :).
The story is too long.
Is Intel split or not? Or is Intel stupid?
Intel has no value
 
Come to think of it, I've seen a rather insane remark that Moore's Law has been promoted by Apple...
It's not as though Apple promoted it.
But Intel is no longer Moore's Law, either.
 
The story is too long.
Is Intel split or not? Or is Intel stupid?
Intel has no value
Just my opinion, but Intel will be split eventually. The vertical integration model is not financially viable at the volumes Intel has IMO.

"Intel" has done things that are arguably "stupid". I would never state that an entire company is simply "stupid".

Intel has a great deal of value. Anyone that believes otherwise is mistaken. Intel's current state is a result of poor strategy and risk management IMO.
 
Sometimes the path to success only lies on the other side of a good bloodletting ;)

As Churchill and many others said: Never let a good crisis go to waste

Nice autopsy report by ARPU:
For forty years, Intel was the undisputed king of computing. Today, its empire is in a state of collapse. But the common narrative—that it was outmaneuvered by rivals like AMD and Nvidia—misses the bigger, more painful truth. Intel's greatest enemy was itself. This ARPU Deep Dive is a forensic autopsy of the catastrophic, self-inflicted wounds and strategic arrogance that led to Intel's decline.


 
IBM, Kodak, Nokia, Intel -- all companies who were once totally dominant in their field but stuck stubbornly to their now-outdated business model/technology when the world moved on...
 
all companies who were once totally dominant in their field but stuck stubbornly to their now-outdated business model/technology when the world moved on...
Is it really the business model, though? Because even with TSMC you need an IDM-like relationship; you need to work closely with them the way a designer and a manufacturer would. It's just that the two are separate companies. It has more to do with Intel's stubborn decisions and its failure to adapt to changes in the technology.
 
IBM, Kodak, Nokia, Intel -- all companies who were once totally dominant in their field but stuck stubbornly to their now-outdated business model/technology when the world moved on...
Kodak is a special case which isn't really comparable to IBM, Nokia, and Intel. Kodak was and is a chemical company. Photography and film-making went digital, and that was completely outside their expertise. Digital cameras (which Kodak tried) of any sort were a small fiscal fraction of the entire ecosystem of film manufacturing and processing that Kodak dominated, as much as Intel dominated CPUs for a while. The equivalent of what Kodak did for film would be chip fabrication in the digital world, and Kodak did not see itself doing that. I understand why. To me, it would have been an interesting question, like why not acquire a nascent TI or similar company, but let's face it, that would be an MBA case study for the ages if Kodak attempted it, whether they succeeded or failed.

I don't know enough about Nokia to comment.

Intel and IBM are well understood problem cases by comparison. Especially IBM. The Z-series temptation is irresistible, and it is easy to see why. The obvious questions like, "Why did IBM divest from IBM Microelectronics?", or "Why did IBM sell its X-series server business and its PC business to Lenovo?" are still pondered. I recently pointed out some facts to a person in the semi field who should definitely know better. Lenovo's entire corporate revenue and net profit for 2024 were $69.1B and $1.4B. IBM's 2024 revenue and net profit were $62.8B and $6B. That's why the inventor of the PC no longer makes them, or x86 servers.
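
To make that margin point concrete, here's a quick back-of-the-envelope comparison in Python using the figures quoted above (treated as approximate, not audited numbers):

```python
# Rough net-margin comparison from the 2024 figures quoted above.
# Figures are approximate; this only illustrates the point.
companies = {
    "Lenovo": {"revenue_b": 69.1, "net_profit_b": 1.4},
    "IBM": {"revenue_b": 62.8, "net_profit_b": 6.0},
}

for name, fin in companies.items():
    margin_pct = 100 * fin["net_profit_b"] / fin["revenue_b"]
    print(f"{name}: ~{margin_pct:.1f}% net margin")

# Lenovo: ~2.0% net margin
# IBM: ~9.6% net margin
```

Roughly 2% versus roughly 10% net margin: PCs and x86 servers simply don't clear the bar IBM sets for itself.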
 
Kodak stuck to film and film-based cameras and missed the transition to digital; even though they internally developed the first digital camera, it was written off as pointless -- why would anyone give up super-high-quality film for terrible-quality digital with a big camera and appalling battery life? They didn't see how advancing technology and changing markets would change this completely.

Nokia dominated the world mobile phone market for many years -- I worked for an IDM at the time, and the "holy grail" was winning Nokia phone business for chips. They completely missed the smartphone market driven by Apple, basically questioning why anyone would want a phone with a massive screen and no keypad and poor battery life. They didn't see how advancing technology and changing markets would change this completely.

IBM dominated mainframe computers, dismissing the rise of minicomputers (DEC) and then microcomputers -- why would anyone (business or individual) want a tiny, feeble, unreliable computer of their own when they can just access one of our super-capable "big iron" ones? They didn't see how advancing technology and changing markets would change this completely.

I'd say the parallel with Intel is pretty clear, it's just that they're not as dead as Kodak/Nokia/IBM -- well, not yet anyway... ;-)
 
Kodak stuck to film and film-based cameras and missed the transition to digital; even though they internally developed the first digital camera, it was written off as pointless -- why would anyone give up super-high-quality film for terrible-quality digital with a big camera and appalling battery life? They didn't see how advancing technology and changing markets would change this completely.
Not correct. Kodak saw digital cameras, both still and video, consumer and commercial, coming early on. The problem was that none of those markets were significant enough in revenue or gross margin to replace film manufacturing and developing. Not even close. I used to know people who worked there in that era (I went to school with several, and a couple of relatives). The Intel transitions are mere tremors compared to the Richter 9 quake Kodak saw coming for years, and the alternatives were all insufficient or terrible.
Nokia dominated the world mobile phone market for many years -- I worked for an IDM at the time, and the "holy grail" was winning Nokia phone business for chips. They completely missed the smartphone market driven by Apple, basically questioning why anyone would want a phone with a massive screen and no keypad and poor battery life. They didn't see how advancing technology and changing markets would change this completely.
This is the story I've heard, but I've never met any Nokia insiders, so what "really happened" seems like a mystery to me.
IBM dominated mainframe computers, dismissing the rise of minicomputers (DEC) and then microcomputers -- why would anyone (business or individual) want a tiny, feeble, unreliable computer of their own when they can just access one of our super-capable "big iron" ones? They didn't see how advancing technology and changing markets would change this completely.
I don't agree at all. IBM designed and manufactured minicomputers, the System 36 and the System 38. The System 38 in particular was very innovative, from the perspectives of hardware and software computer architecture. IMO, more technically sophisticated than the VAX or the Data General systems. The underlying problem was IBM's sales force and support organizations were structured to support mainframe sales and service, not minicomputer customers, and minicomputer sales diluted their gross margins. And that was certainly true for any X86-based products.
I'd say the parallel with Intel is pretty clear, it's just that they're not as dead as Kodak/Nokia/IBM -- well, not yet anyway... ;-)
I don't agree with this either. Intel is a study in incompetent senior management, including the BoD. End of story. They knew all about what was going on, and just got hooked on an x86 fix and internally-focused fabs. Watching from the inside and the outside has been akin to torture.
 
I don't agree with this either. Intel is a study in incompetent senior management, including the BoD. End of story. They knew all about what was going on, and just got hooked on an x86 fix and internally-focused fabs. Watching from the inside and the outside has been akin to torture.
From the outside, it appears to be more of a "This is how we do it" problem.

By basing their chip leadership on their ability to maintain foundry dominance, Intel set the course for financial failure.

One could argue that Intel would have maintained chip volume dominance if (fill in several blanks); however, I would argue that it was never a realistic possibility for Intel to maintain dominance in EVERY kind of high-volume chip produced, and therefore it was inevitable that other foundry services would eventually have many times Intel's volume to amortize the rising cost of equipment over.
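
A toy sketch of that amortization argument, with entirely made-up fab costs and wafer volumes, just to show how volume drives the per-wafer equipment burden:

```python
# Toy model: a fixed fab/equipment cost spread over wafer volume.
# All numbers are hypothetical and only illustrate the scaling argument.
FAB_COST = 20e9  # assumed leading-edge fab plus equipment cost, in dollars

def equipment_cost_per_wafer(wafers_per_year: float, life_years: int = 5) -> float:
    """Straight-line amortization of the fab cost over its useful life."""
    return FAB_COST / (wafers_per_year * life_years)

for label, wafers_per_year in [("captive IDM volume", 300_000),
                               ("merchant foundry volume", 1_500_000)]:
    print(f"{label}: ~${equipment_cost_per_wafer(wafers_per_year):,.0f} per wafer")
```

With these invented numbers the merchant foundry carries roughly a fifth of the equipment cost per wafer, which is the whole point: whoever has the volume wins the cost game.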

This is just my current take on the matter from a strictly business and finance standpoint.

I do agree on the spectacle being torture to watch.

As for IBM's case, I think that they were relegated to obscurity due to mis-steps, and came back as something completely different. One could argue that this is what Kodak needed to do as well.... but failed to do so.

Intel, on the other hand, need only split off the foundry and refocus their considerable design capabilities on using industry-standard tools and processes. They need to stop trying to brute-force their way to dominance, since even if this method yielded superior performance (which it hasn't, due to other issues IMO), it would still lead to financial failure.

I remember when I was a very young and naive engineer, the answer was always "what's the best way" to accomplish a specific task without regard to how expensive it was to make, how hard to maintain, or how difficult it was to debug.

Designing with all this in mind required a few self-inflicted grey hairs to be earned ;).

I feel like Intel spent so much time on top that some of this thinking got lost amid their near-absolute monopoly. I know from my own experience in management that it is VERY difficult to make big changes when a company is making great margins. Convincing people that what you are currently doing is a mistake, while that method is making huge profits, is nearly impossible at any company.
 

Do you remember when Intel said that the fabless model is collapsing? This was Intel's Pearl Harbor moment: "I fear all we have done is to awaken a sleeping giant and fill him with a terrible resolve." The rest, as they say, is history.

 
Switching topic order to Intel then IBM:
I don't agree with this either. Intel is a study in incompetent senior management, including the BoD. End of story. They knew all about what was going on, and just got hooked on an x86 fix and internally-focused fabs. Watching from the inside and the outside has been akin to torture.
Agreed, especially the torture bit, and going deeper, I see routine, extreme high-level engineering incompetence that eventually extended to, and destroyed, the leading-edge logic fabrication golden goose with the failed transition to 10 nm/Intel 7.

Off the top of my head, older examples include their RISC attempts, Itanium, Rambus RDRAM (which resulted in two million-part recalls, one of which was motherboards Dell was about to ship), front side bus vs. ccNUMA, and AMD64. Netburst (P4), at minimum a fabrication failure, was predicated as a "marketing architecture" on getting parts up to 10 GHz just before Dennard scaling failed.

No surprise that a company with such sustained, horrible high-level technical management would ignore their teething problems with 14 nm and then fail for years with the next node. I also see this as illuminating the IDM-or-not question:

Would Intel be succeeding well enough today if they hadn't failed so hard with 10 nm, and if BK hadn't gutted their verification function? Those together trashed their x86 business line for a long time; and after their exit from memory, that was, we believe, the only thing they were institutionally able to do. It's a path they couldn't take, which leads to:

Another way to look at this is risk management: the success of the IDM model depends on sufficient success at both design and fabrication. Whatever you can say about AMD's fabrication problems before and after the spinoff, not being locked into GlobalFoundries allowed them to eventually move to the pure-play foundry TSMC. A company can be good at only so many things; for AMD, see their GPGPU software problems.
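
One way to put numbers on that risk argument, with invented per-generation success probabilities (nothing here is sourced; it just shows how the two risks compound for an IDM while a fabless firm can re-source manufacturing):

```python
# Toy risk model: an IDM needs both its design and its in-house process to
# land each generation; a fabless firm can fall back to whichever foundry
# executed. Probabilities are invented, for illustration only.
p_design = 0.85
p_fab = 0.85
generations = 5

idm_per_gen = p_design * p_fab   # must win at both, in-house
fabless_per_gen = p_design       # manufacturing risk largely outsourced

print(f"IDM, one generation: {idm_per_gen:.2f}")
print(f"Fabless, one generation: {fabless_per_gen:.2f}")
print(f"IDM, {generations} generations running: {idm_per_gen ** generations:.2f}")
print(f"Fabless, {generations} generations running: {fabless_per_gen ** generations:.2f}")
```

Even with generous odds, having to win both bets every generation, with no outside fallback, erodes quickly over several nodes.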
I don't agree at all. IBM designed and manufactured minicomputers, the System 36 and the System 38... The underlying problem was IBM's sales force and support organizations were structured to support mainframe sales and service, not minicomputer customers, and minicomputer sales diluted their gross margins.....
That structure was a big historical change for IBM. Their punched-card computing history was much more in the minicomputer style, and their history of minicomputers goes back much further.

The 1954 IBM 650, not a mainframe, was the first mass-produced computer, and at the low end it was followed by the very clever 1959 IBM 1620 scientific computer. The first version of the 1620 spent a great deal of its budget on core memory, which IBM was very good at, versus the drum memory used in the 650. Thus its internal code name CADET was backronymed into "Can't Add, Doesn't Even Try": math was done with lookup tables in memory rather than logic.

According to Wikipedia, citing the memories of the people involved, the other half of the phrase "space cadet" was the code name for the very successful IBM 1401 for business, also released in 1959.

The 1401 continued to be produced after the System/360 consolidation of IBM's numerous previous computer lines, and it was the father of the System/3, which led to four System/3x systems, the last of them the ambitious System/38, the ancestor of the AS/400 and now the "i" line.

On the scientific end, the CADET, which according to Wikipedia sold about 2,000 units, was followed by the IBM 1130, based on System/360 technology, with about 10,000 sold. It was the introduction to programming for a very large number of people, including myself at the end of its run. A lot of cleverness went into making it as inexpensive as possible, like its very slow entry-level printer. Wikipedia says it was followed by the System/7 and the Series/1, all three from Boca Raton, Florida, home of the IBM PC.

As a programming and systems type, I see the major IBM inflection point as the internal politics that accompanied the "bet the company" System/360 effort, which was so difficult, from hardware production to software, that IBM swore never to try anything like it again; that line eventually became today's high-end "z" series.

I believe that environment allowed the decision to eschew virtual memory to become iron dogma, all the way through the first set of System/370 models. That immediately lost them almost all of the higher-education market, a lot of which moved to DEC's first mainframe lines, the PDP-6/10/20s, with a dead-end address space of 18 bits of 36-bit words, or about 1 MB (IBM's pre-64-bit mainframes eventually offered 31 bits of 8-bit bytes, or 2 GB). Along with very bad systems software, this funneled their high end into an eventual big-business ghetto with no mind share.
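
A quick sanity check on those address-space numbers (my arithmetic, not the original post's):

```python
# PDP-6/10/20: 18-bit addresses of 36-bit words, expressed in 8-bit bytes.
pdp10_bytes = (2 ** 18) * 36 / 8
print(f"PDP-6/10/20 address space: {pdp10_bytes / 2**20:.2f} MB equivalent")  # ~1.13 MB

# IBM 31-bit byte addressing (System/370-XA and later).
ibm_bytes = 2 ** 31
print(f"IBM 31-bit address space: {ibm_bytes / 2**30:.0f} GB")  # 2 GB
```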

At the lowest end, IBM, using the Boca Raton crew, launched the Intel 8088-based IBM PC. But except for the minor XT revision, the company never really understood the importance of compatibility. They were able to once more move the whole industry with the 286-based IBM AT. What followed was very ugly: Compaq didn't care about IBM's internal infirmities and launched the first major 386 desktop; and see Micro Channel, and its echoes at DEC following the success of the Unibus and Q-bus.

IBM, having bluewashed Boca Raton, realized its design cycle was about four years long compared to the industry's two years or less, so they designed a high-speed 16/32-bit bus and tried the old trick of initially using and licensing it only at half speed. Other companies were not so constrained.
 

Some might dismiss these failures or mistakes as 20/20 hindsight. But after so many serious missteps by a long list of Intel’s CEOs, board of directors, managers, and engineers, I have to ask: is this the inevitable fate of the IDM business model? If not, then someone must explain why Intel has shown such a persistent pattern of incompetence and poor leadership over the past 20 to 30 years.

Is Intel a magnet that only attracts incompetents, scammers, and crooks who made bad decisions - for more than 20 years? I don't believe so.
 
[...] is this the inevitable fate of the IDM business model? If not, then someone must explain why Intel has shown such a persistent pattern of incompetence and poor leadership over the past 20 to 30 years.

Is Intel a magnet that only attracts incompetents, scammers, and crooks who made bad decisions - for more than 20 years? I don't believe so.
We'd need to look at other IDMs; AMD's history, going back even more decades, could support the first thesis. But not TI, I think; while they made a huge mistake trying to shift into a consumer goods company, they're remarkable for how many generations of technology they've mastered well enough, going back to discrete transistors.

Or consider that bad management is very common, and making chips is particularly unforgiving. "Robert X. Cringely" had a fascinating thesis, with an illustrative anecdote, in his Accidental Empires: that it's hard for one man to destroy a big company, especially quickly (I think), except in tech.

So, all of a sudden Intel's yields dived, and this was traced to wafer contamination. But the manufacturer swore up and down the wafers were good when shipped, so eventually Intel had someone follow a shipment from that manufacturer all the way to the fab, where he found a new shipping clerk unsealing and counting the bare wafers on his desk to make sure Intel wasn't being shortchanged!

As far as I know, a very small number of people at the top made a lot of these cited bad Intel decisions and stuck to them for far too long, even as engineers below told the higher-ups about the problems. To no avail; an extreme example is Craig Barrett's "I don't pay you to bring me bad news, I pay you to go make my plans work out" when told about the growing gap between what Itanium could actually do and what 32-bit x86 CPUs could.

AMD filled that gap with AMD64 and ccNUMA, with Intel eventually following. Which brings up another huge issue: these are long-time-frame processes, both designing and implementing new CPUs and new fab nodes. You start, disaster only becomes tangible much later, and Intel sure is good at ignoring disasters for a long time, 10 nm being another example. Or see Lip-Bu Tan saying dropping SMT was a mistake; Intel entrail-readers don't think that can be addressed until after Diamond Rapids, so some time after 2026?

One compounding 10 nm problem was the company's response, or lack thereof. Cannon Lake's release was proof something was terribly wrong in management. This was clear to everyone by virtue of it having only one underwhelming SKU with a huge, disabled iGPU that they nonetheless provided drivers for. No one could tell the emperor he had no clothes and be heard.

So that's three IDMs I have some knowledge of, including using their parts; what about others, historically or today? Memory manufacturers are IDMs, and there are only three left for DRAM outside of China. Samsung Electronics would be an example, and it is lagging or going bad in both DRAM and logic.
 
Do you remember when Intel said that the fabless model is collapsing? This was Intel's Pearl Harbor moment: "I fear all we have done is to awaken a sleeping giant and fill him with a terrible resolve." The rest, as they say, is history.

[chuckles] Well... Seems they couldn't have been more incorrect. Still, across many forums and many voices, there are plenty of people who believe all Intel needs to do is REALLY commit to more spending, more processes, and more fab development, and they will be back where they once were.

For me, this is the textbook definition of insanity (doing the same thing and expecting different results).

Back in 2012, it was understandable why Intel would say this. As I said, what was working was working so well that there could be no tolerance for an opposing voice.

"The rest, as they say, is history" --- Indeed.

So that's three IDMs I have some knowledge of, including using their parts; what about others, historically or today? Memory manufacturers are IDMs, and there are only three left for DRAM outside of China. Samsung Electronics would be an example, and it is lagging or going bad in both DRAM and logic.
Memory is much less reliant on design capability or cutting edge fab, and is largely a commodity. If anything, it is mostly dependent on cost.

I think this is quite different from processors ;).
 

To be specific, is the advanced logic IDM industry in a hopeless situation with no path forward? Only two companies are still holding on, Intel and Samsung, and both are in a difficult situation.
 
I don't agree with this either. Intel is a study in incompetent senior management, including the BoD. End of story. They knew all about what was going on, and just got hooked on an x86 fix and internally-focused fabs. Watching from the inside and the outside has been akin to torture.
All these stories, except for perhaps the Nokia one, have the same underlying theme - one or more disruptive new technology/market combos arising and the incumbent failing to figure out how to deliver products to serve those new markets in a way that met the margin / volume / channel needs of their existing successful business.

An incumbent company encountering a disruptive technology and market often finds itself in a precarious position, famously described by Clayton Christensen as the "Innovator's Dilemma". These well-managed, successful companies often fail to adapt because their existing business models, which focus on pleasing their most profitable customers, are fundamentally ill-equipped to respond to a new market built around a simpler, cheaper technology.
 