Ranked: America’s Largest Semiconductor Companies

I guess I need to update this one:

 
Not sure where you're getting your data from, but AMD has definitely not lapped them in CPUs, particularly in design. In fact, the fastest single-threaded performance, the most difficult metric, is owned by Intel, and by a decent amount. Last I checked, Intel owns over 80% of the mobile and desktop markets. Lapping, from behind?
I was talking about various design aspects, not current market share.
In servers they probably still have the edge, though, mainly because of the process tech they're on. Their architecture isn't any better, but their node is not only a lot denser, it's also more power efficient.
I disagree that AMD's architecture is not any better.
With regards to Intel chips in iPhones, they weren't going to be ARM-based, as far as I know. I've never heard that said, but I'm not 100% sure on it. Apple wanted Intel chips, Intel said they couldn't make them that cheap, and that was that. But if you have information showing they were ARM-based processors, please let me know, so I know better. Until then, I'm going to assume they were x86-based.
Intel was working on a mobile x86 for a while, but nothing came of it. I'll leave out the details. The Apple chip in question was Arm-based, which was also noted in the link I posted.
Actually, their designs haven't been that bad; they were just delayed because 10nm was a disaster, and the designs targeting it were delayed along with it.
The designs were not competitive even if 10nm had been on time. AMD got a bigger head start because of 10nm's lateness, but there's more to it than that.
Krzanich, as CEO at the time, wasn't responsible in any way for their fab problems? Not at all? How about the contra revenue fiasco? You're dead on that x86 on phones wasn't a great idea; it made no sense at all. But Krzanich didn't see it that way, and he tried to force-fit it. I remember thinking at the time what an idiot this guy was, and how it was going to bite him. Sure enough, it did. Intel's problems have not been about design; it's that their fabs have been so delayed that the designs have been delayed too.
I didn't say BK didn't play a role in Intel's 10nm problems. But he had little to do with the design problems; that wasn't his area of expertise at all.
GPUs came from nowhere to now having 4% market share, and finally getting respect in the industry. But they weren't started on Pat's watch anyway. They're not crazy successful, but in terms of progress, the last year or so has been extremely fast. They were delayed, then had horrible drivers, and now have better drivers and are gaining market share. Realistically, that's pretty good, but again, this was all started before he got there. A long time before.
Intel never considers 4% of anything a success. Maybe for a start-up that's acceptable.
Optane, again, was NOT designed or marketed on his watch. He killed it. Was that a mistake? I'm not sure; if you think it was, then you could blame it on him. It was a great technology, but I guess the improvements didn't warrant the cost, and most people were satisfied with SSD performance and endurance.
The Optane cancellation was an embarrassing screw-up. And it keeps on going. The VP of the Optane group is still on LinkedIn reminding everyone he still has Optane SSDs to sell. What a mess. As for Optane's advantages and problems, that's a long discussion.
I have no idea what you mean by them cleaning up their IA-64 architecture. IA-64 was discontinued years ago, unless you know something I don't. I'd love to know it, because I was always fascinated by that architecture and am kind of sad it went away. I can easily see why, too. I just don't know why they couldn't get it to work, but I think the assumption that the compiler would be smart enough to schedule instructions efficiently was a big part of it. I don't know Pat's involvement with that, but you could pin it on him if he was deeply supportive of it. Then again, it started with HP a long time ago, before Pat was CTO, so maybe he should have killed it sooner, but he certainly wasn't the impetus behind it. In any case, if you make decisions, not every one of them is going to be correct.
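To illustrate the compiler-scheduling point, here is a toy C example of my own (nothing here is from an Intel document): EPIC asked the compiler to find instruction-level parallelism statically, but plain C often hides information, such as whether two pointers alias, that an out-of-order core simply resolves at run time.

/* Toy example of why static (compile-time) instruction scheduling is hard,
 * which was central to the IA-64/EPIC bet. Unless the compiler can prove
 * that dst and src never alias, each load must stay behind the store from
 * the previous iteration, so independent work cannot be bundled together.
 * An out-of-order x86 core resolves the same question dynamically.
 */
void scale(float *dst, const float *src, float k, int n)
{
    for (int i = 0; i < n; i++) {
        /* If dst and src might overlap, this load depends on the
         * previous iteration's store; the static schedule has to
         * assume the worst case. */
        dst[i] = k * src[i];
    }
}

/* With C99 'restrict' the programmer asserts no aliasing, and the compiler
 * is free to software-pipeline and reorder aggressively -- the kind of help
 * EPIC needed far more often than a dynamically scheduled core did. */
void scale_restrict(float *restrict dst, const float *restrict src,
                    float k, int n)
{
    for (int i = 0; i < n; i++)
        dst[i] = k * src[i];
}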
Wow, you've got me here. What a brain fart! I wasn't intending to discuss Itanium or IA-64, but that's what I typed. Itanium has been dead for a dog's age. Obviously the white paper I linked to was a proposal for simplifying 64-bit x86. Senility or early-onset Alzheimer's might be setting in. I was associated with Itanium during part of its silicon development, so I must have a real short circuit going on.
I think he's a great CEO, not a perfect one.
Everyone has an opinion. I'm less convinced.
 
IDM had some other bad consequences, though not quite at the same level. x86 capture of the fabs meant that x86 captured Optane. A rational way to develop a new product would be to maximize its potential market by making it available everywhere, as well as working with potential customers early to understand its strengths and weaknesses. Instead x86 capture made it work only with a secret, proprietary extension to the Xeon bus, and paranoia meant customers did not get their hands on it until it was supposedly in high volume production, at which point its weaknesses (amplified by the weird bus) soured most potential customers.
Intel never fab'd Optane. It was a joint development with Micron, and Micron fab'd it. Optane used as a DRAM extension technology required memory controller changes and instruction set changes. This paper, though a bit intense, describes much of the issue.


I'm not sure if Intel really intended to keep Optane DIMMs proprietary, or whether they just never figured out how to enable the industry.
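To make the "memory controller changes and instruction set changes" concrete, here is a minimal sketch (mine, not from the paper above; the /mnt/pmem/example path is just a placeholder) of how software used Optane DIMMs in App Direct mode: the application maps a DAX file directly and must explicitly flush and fence with CLWB, one of the instructions Intel added for persistent memory.

/* Minimal sketch of the software side of Optane DIMMs in App Direct mode:
 * the application mmaps persistent memory directly and must flush cache
 * lines (CLWB) and fence before a store counts as durable.
 * Assumes a DAX-mounted filesystem at /mnt/pmem (placeholder path) and a
 * CPU with the CLWB instruction; compile with -mclwb.
 */
#include <fcntl.h>
#include <immintrin.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define PMEM_LEN 4096

int main(void)
{
    int fd = open("/mnt/pmem/example", O_CREAT | O_RDWR, 0644);
    if (fd < 0 || ftruncate(fd, PMEM_LEN) != 0) {
        perror("open/ftruncate");
        return 1;
    }

    /* MAP_SHARED on a DAX file maps the persistent media directly,
     * with no page cache in between. */
    char *pmem = mmap(NULL, PMEM_LEN, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
    if (pmem == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    const char *msg = "hello, persistent world";
    memcpy(pmem, msg, strlen(msg) + 1);

    /* Ordinary stores only reach the CPU caches; CLWB pushes the dirty
     * cache lines toward the DIMM, and SFENCE orders the flushes. */
    for (uintptr_t p = (uintptr_t)pmem;
         p < (uintptr_t)pmem + strlen(msg) + 1; p += 64)
        _mm_clwb((void *)p);
    _mm_sfence();

    munmap(pmem, PMEM_LEN);
    close(fd);
    return 0;
}

Memory Mode, by contrast, used the DIMMs as transparent volatile capacity behind a DRAM cache and needed none of this, but it also gave up persistence, which is part of why the ecosystem had trouble deciding what the product was for.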
 
When those two were CEOs, many companies had their own fabs, including AMD, IBM, and too many others I can't even remember.

Intel made a lot more money than TSMC, and even TSMC commented on how Intel could get much higher margins. Given we're talking about the 20th century, when companies made money by using their fabs to get the best chips out there, it's not clear how this would have made sense for Intel.

Keep in mind, when the 486 came out, it was $1000. Yeah, in 1989 dollars. That's where the money was.

I don't see how TSMC would factor into anything. As an investment? Maybe. But Intel had much better fabs and tech than TSMC; why would they want to buy them? What possible good would it have done at that time?

Now, if you want to say Intel should have considered outsourcing much sooner than they did, you could make a really strong argument. I think the 20th century was too soon, though, because the competition in processors was very intense and the cost was crazy high compared to today. Even NVIDIA said a long, long time ago that Intel should open its fabs to others. So I agree they should have done it sooner; I just don't think 1995 or so was the time. The money was in selling your own designs, not making other people's. But when it became obvious that silicon was going to be in all sorts of devices, Intel should have pivoted to that. Still, I think being an anchor investor in TSMC made less sense than opening up their own fabs, given how much better their tech was.

As a major company's CEO or chairman, it's their responsibility to bring the outside world into Intel. When the worldwide semiconductor business model started changing 30+ years ago, Intel's leaders couldn't see the changes and failed to make the necessary changes at Intel. I don't know exactly which new business model would have been more appropriate for Intel. But I do know Intel's past leaders falsely thought they didn't need to change because Intel was making a lot of profit. Without the right business model and vision, Intel walked itself into today's difficulties and is now scrambling to fight for its own survival.

The world is not, and was not, obligated to accommodate Intel. It's up to Intel's leaders to make Intel relevant to the world. The best time for Intel to change to whatever business model it needed was when Intel was making a lot of money. Unfortunately, Intel is forced to make changes today, when many of Intel's competitors are much bigger or better and Intel itself is much weaker.
 

"I don't see how TSMC would factor into anything. As an investment? Maybe. But, Intel had much better fabs and tech than TSMC, why would they want to buy them? What possible good would it do at that time."

That's exactly one of Intel's problems, and one of Intel's past CEOs' weaknesses. They couldn't see the big changes in the semiconductor industry while many others not only recognized them but also embraced them. Many of those companies are doing very well today.

Not too long ago, Intel was still looking down on the fabless and foundry business. Intel's past leaders lacked real long-term vision and failed to react even when the situation was so obvious.

IDM 2.0 is an attempt to correct Intel's past stubbornness and ignorance. It's a very difficult job for Pat Gelsinger, and I won't blame that on him.
 
Wow, eight years ago, in 2014, when we were all much younger, there was already a sign that big troubles were coming to Intel.


From @asic_designer:

"What I have been implying is that Intel got spoiled playing in the PC world with almost no competition and locked into the leading edge arrangement with Microsoft to the point where pretty much everyone could not compete. They would go out and try to kill off competition, so much so that AMD won the huge Antitrust suit against them for this.

Nobody there knows how to actually win a market when they have to go out, build it from the bottom, and be hungry. That was my point about the CEOs. There is all this marketing hype about what they CAN do and what they MIGHT do, and very little action about what they HAVE DONE. Even their HYPE on 14 nm is mostly that. They are falling behind because they do not have clear direction and are trying to basically change the process to meet the whims of the marketing guys. Lower power, yet try to keep the PC performance for servers. Not one process, but a few in the background. Missing yields and timelines.

So they try contra revenue (just buy the sockets down to zero and some fool will buy their product) to the tune of $4B, and still have little to NO revenue. They have had an open foundry deal out there through Open Silicon for a couple of years now, but every partner that OS brings to them gets rejected because it might somehow "compete". Intel goes out of their way to learn, interfere with, and control YOUR market as a foundry, such that anyone who uses them would be a fool, since their agenda in the background is to win YOUR market away from you. They did this in 1995-1998 in a big way. Qualcomm used them as a foundry, but their interference in the market without a direct presence killed that relationship. They spend all their time running intelligence-gathering schemes on you and not on making you successful as a customer. This is why TSMC and GF will win out in foundry.

How much the market would have changed IF they had just kept Qualcomm as the overflow foundry customer that they had. At the time, they were selling one technology behind the CPUs, and it was a good process. But they sent legions of marketing types out to try to win the wireless market behind the scenes, control the technology direction to their advantage, and basically screwed up wireless data for about 5 years until Qualcomm bailed on them for strategic reasons.

Now they float listlessly with the new CEO, trying to find some way to stop the bleeding. It would take a major culture change to get the ship right, IMHO."
 
If we want to start the blame game, Gordon Moore (Intel CEO 1975-1987) and Andrew Grove (Intel CEO 1987-1998) must take some responsibility too.

TSMC was actively recruiting an anchor investor before its founding in 1987. Intel declined TSMC's invitation, and Philips took it, becoming a 28% founding investor. The rest is history.
One of Philips Semi's better decisions. I wonder if they made more money from that than from anything else they were doing at the time. It's curious how little remembered Philips' role in getting TSMC started is, and how little credit they get for it.
 
Let's double down on your heresy.

There's certainly a case to be made that the Intel culture embedded under Andy Grove, whilst enormously successful in the environment he was operating in, was too rigid and resistant to change when conditions eventually changed. There's certainly an argument that Intel's problems over the last decade or so have been as much cultural as technical.
 
I would argue the latter. Andy and Gordon seemed to be data-driven and had a culture of adaptability. Killing memory with the stroke of a pen, and the whole concept of strategic inflection points, is very much not congruent with rigidity and a lack of adaptability. To add on: in an old interview I saw with Pat, back when he was at VMW, he said that back in the day Intel was going to go RISC. Pat was upset and thought it was a mistake. Andy then said something to the tune of "show me" and ran a parallel development to let the silicon do the talking. The 486 turned out better, and the rest is history. I don't think that leadership would have spun off the design side in 2005. But that is because there is very clearly nothing wrong with staying an IDM, provided you have the scale to justify whatever nodes you want to make, rather than a result of a culture that doesn't accept critical reassessment.
 
The primary argument for staying with the x86 architecture back then was software compatibility, which is valid. The competing RISC architecture, the i860, was a different animal altogether, targeted as a floating-point monster and at HPC systems. (Back then Intel had a supercomputing product division.) The wild success of the 386 products made the 486 path the obvious choice, business-wise. Nonetheless, politics have always driven Intel as much as or more than objective analysis. In every initiative I've seen Intel create, or in which I was a participant, internal politics were paramount, not objective analysis.

Long ago Intel did take more technical risks, design-wise, than they do today. The first Intel CPU that ever fascinated me was the iAPX 432, which was very innovative but too slow to be practical. Was Gordon Moore smart to approve it, or was he out of his element and talked into a bridge too far? I think the latter, but for a long time several Intel technical leaders came out of that project. In the modern Intel I can't think of any senior people who rose through the ranks after "innovative failures". I will say this: following a poorly conceived strategy over a cliff is not a good idea.
 