
Intel CEO Highlights the Company’s Top Three Mistakes

Daniel Nenni

Admin
Staff member
[Image: Intel CEO Pat Gelsinger]


In a recent interview with Digit, Intel CEO Pat Gelsinger talked about the company’s three big misses. The last decade has seen several technological innovations and big shifts in the industry, and Gelsinger seems to regret that Intel wasn’t able to use its unique position to capitalize on certain opportunities. So, here are Intel’s three mistakes, according to its CEO.

Intel CEO Pat Gelsinger Points Out Missed Opportunities:

In the interview, Gelsinger pointed to three different areas in which he thinks Intel missed out. The first is Intel’s foray into the smartphone market. He said, “Yeah, we missed the mobile wave,” acknowledging that Intel has no significant presence in smartphones today. If you remember, Intel processors did appear in smartphones back in the day (around 2004).

One example is the Samsung i700, which was based on a 400 MHz Intel PXA250 processor. This was years before the first iPhone came out (2007). But today, Intel isn’t involved in the smartphone market.

The next thing Intel’s CEO discussed was the Larrabee project, saying that “we killed the one that would have made all the difference in the world.” This was the American chip maker’s attempt at making a proper graphics card.

Today, that vision has come to fruition in the form of Intel Arc graphics, which have recently gotten much better with newer driver updates. Larrabee was going to be a graphics processor made from original Pentium-based CPUs. It aimed to be a decent enough “GPGPU” (General Purpose Graphics Processing Unit) that could accelerate all sorts of graphics-focused workloads.

However, the Larrabee project was shelved in 2009. The cancellation happened soon after Gelsinger was pushed out of Intel. More than a decade later, he returned to Intel and became CEO in 2021.

Lastly, Gelsinger talked about how Intel was “fundamentally biased to building a great foundry.” Essentially, he means that Intel wanted to be a top-level semiconductor manufacturer. However, this plan somewhat backfired.

While we don’t know the exact details, maybe this ambition caused a delay in innovation and adopting newer manufacturing processes like EUV lithography, which is used for Meteor Lake chips. The 14nm node of Intel was used for a long time. It was during this time that TSMC gained a massive lead along with AMD who launched Ryzen processors, which have become popular today.

“Siliconomy” is a term Intel coined to describe how semiconductor manufacturing indirectly impacts $11 trillion of global GDP (with $3 trillion in direct impact). Gelsinger also talked about how missed opportunities will always happen because it’s hard to get everything right. With 14th Gen Meteor Lake laptop chips launching next month, we will soon see Intel’s biggest shift in microprocessor architecture in years.

So, what do you think of Intel’s mistakes? Do you think they could have been a bigger company if they never failed on the above plans? Let us know in the comments below.

 
Those "mistakes" are all focusing on products, and ignoring the massive elephant in the room which was the multi-year 10nm process disaster...
A disaster made worse by inability to analyze it, communicate it internally, and pivot the product plans to work with reality.
 
On the design side, the biggest mistake Intel made was only being interested in CPUs and their associated chipsets for desktop, client, and datacenters. Everything else (memory [Optane], storage [flash, SSDs], networking [NICs / superNICs, switches, wireless, 5G modems, etc.], driving assists, AI chips, GPUs, silicon photonics, comms processors... I could probably add a few others to this list...) gets cut, cancelled, or sold off. And the biggest advocate of this CPU-centric strategy in my personal experience is the current CEO. Intel misses out on or runs away from one growing market after another.

I'm anxious to see how IFS plays out. It's too early to tell, but I'm hearing good things about progress so far. I'm skeptical though, neither Gelsinger nor Esfarjani are fab or foundry visionaries, and I doubt either one could spot another 10nm-like disaster in the making.
 
But was it a mistake ?

Intel's blessing - and curse - was its dominant position with typically over 80% market share (and pricing power) in perhaps the highest margin semiconductor segment (microprocessors). Protected from much effective competition for 2 decades by the WinTel PC moat and with a secure, growth market what other products could ever achieve the same margins and compete ? It's only when the PC market demand isn't large enough to maintain the fab investment that expanding the product portfolio becomes a necessity rather than a luxury. Which is arguably where we are now with an ultimate choice between growing a foundry business (growing its own product portfolio would probably be too slow now) or going fabless.

It's always a problem when you have such a successful and dominant product line. Who wants to work in the less successful areas ? And just how much margin dilution are investors prepared to tolerate for increasing sales ?

Is it any surprise that Intel lacked the stamina to see through so many of its sideline products ? Shouldn't we ask instead, why not stick to the core business and the things they uniquely do well ?

In some ways, Intel's looked like a classic "cash cow" business for some time. Albeit one that needs massive technology and capital investment to maintain.

Spare cash could be invested in promising start-ups without direct ownership or management (I know, that's what Intel Capital does and it's not clear it's been that successful - interesting to know if Intel Capital has produced better or worse returns than the new product groups and largely failed acquisitions).

This all reads like a recipe for a fabless "core Intel" business to serve the legacy x86 business and run for profit and cashflow in a more stable, lower growth market. That wasn't my intention when I started writing this comment ! But all products have lifecycles and x86 processors are ultimately no different. We just don't know how long they have to run. But there's certainly a lot of profitable silicon still to be made. And Intel internal fab is not necessarily the way to do that.
 
Previous IFS attempts have failed. In my opinion they were completely mismanaged: foundry was not an executive focus, and infighting doomed it. Intel certainly has the technology, but they lacked the outbound foundry experience and ecosystem. This time around IFS is a focus, and as long as Pat Gelsinger is CEO it will continue to be. The problem I see for IFS is expectations. It takes years to build a successful foundry business that can scale revenue. Five years at a minimum. Samsung has been at it for what, 15 years? Apple was their first big customer with the iPhone, until Apple switched to TSMC with the iPhone 6 (20nm).
 
Intel's hubris is legendary. Intel once said foundries could not go where IDMs can for process development. Clearly that is not the case. More recently Intel said "AMD is in the rearview mirror". Pat seems to be getting a little more humble as his foundry experience grows, absolutely.
 
But was it a mistake ?
In my opinion, absolutely. Workloads evolve, new workloads appear, old workloads are marginalized. Synergies between product lines can lead to massive revenue opportunities. CPUs are often not the best solutions for power efficiency, best performance, or best cost performance. Intel's single-minded approach, led by Gelsinger whenever he was around, is in my opinion one of the reasons why Nvidia's market cap is about 6x Intel's.
Is it any surprise that Intel lacked the stamina to see through so many of its sideline products ?
No. It's called lack of vision and foresight, and application understanding.
Shouldn't we ask instead, why not stick to the core business and the things they uniquely do well ?
What if the core business is becoming marginalized by different use models? This is why Intel missed mobile and handheld.

Spare cash could be invested in promising start-ups without direct ownership or management (I know, that's what Intel Capital does and it's not clear it's been that successful - interesting to know if Intel Capital has produced better or worse returns than the new product groups and largely failed acquisitions).
Intel Capital's charter is not to make returns, it is to foster growth in Intel's ecosystems.
 
Those "mistakes" are all focusing on products, and ignoring the massive elephant in the room which was the multi-year 10nm process disaster...
I would classify the three listed mistakes as issues with intel's strategy or its vision. 10nm is more of a tactical/execution mistake. Granted I could see an argument that there was a strategic mistake of not putting enough resources in place to deal with the massive complexity increase from the rapid proliferation of multi-patterning, or that too much focus was being diluted due to X-point and NAND TD. Regardless of the pedantic issue of strategic or tactical mistakes, intel's 10nm woes were extremely disruptive in a strategic sense, and might well have been the single largest contributing factor to their current poor situation.

I'm anxious to see how IFS plays out. It's too early to tell, but I'm hearing good things about progress so far. I'm skeptical though, neither Gelsinger nor Esfarjani are fab or foundry visionaries, and I doubt either one could spot another 10nm-like disaster in the making.
Out of curiosity, what is your issue with Keivan and Ann, as both have a good bit of manufacturing and R&D expertise?

On the design side, the biggest mistake Intel made was only being interested in CPUs and their associated chipsets for desktop, client, and datacenters. Everything else (memory [Optane], storage [flash, SSDs], networking [NICs / superNICs, switches, wireless, 5G modems, etc.], driving assists, AI chips, GPUs, silicon photonics, comms processors... I could probably add a few others to this list...) gets cut, cancelled, or sold off. And the biggest advocate of this CPU-centric strategy in my personal experience is the current CEO. Intel misses out on or runs away from one growing market after another.
GPUs and not really doing anything with their AI IPs until relatively recently are definitely a big miss. The photonics sell-off was just the pluggable transceivers portion, right? If that is the case isn't that a low value add part of the photonics pie (at least compared to the on-package parts of it)? 4/5G modems also seem like a waste of time after intel's mobile play went bust. Apple will eventually go to an integrated solution, and everyone else already had their own solution baked into their SOCs. As for the memory business, FPGA, and Mobileye stuff I liked it in theory. In practice isn't having these things as an adjacency to sell more CPUs a mistake as it makes it harder to sell those products if they must be tied to Xeon or intel laptops? Would it not be better for them to do the Mobileye thing so they can do what they need to do to have successful products rather than pumping up NEX or DCAI revenues?
 
Would it not be better for them to do the Mobileye thing so they can do what they need to do to have successful products rather than pumping up NEX or DCAI revenues?
This was a big part of what went wrong with Xpoint. They built on a scale to go after an exabyte industry, but made a product that only worked with Xeons at a premium price, which was a petabyte niche market strategy. Inventory piled up and fabs were idled.

Exabyte was right but they should have figured out how it could be used with any host, and had a model based on lowering costs. But, when Xeons run the strategy, everything has to be contributing to Xeon success, not diversification.

They were not building NEW business lines. No surprise they now trim back to the only business they believed in.
 
Out of curiosity, what is your issue with Keivan and Ann, as both have a good bit of manufacturing and R&D expertise?
I have no problem with Ann at all. She has DEEP fab technical and managerial experience, but now she's moved into technology development. Keivan is a logistics guy, not a fab guy, and I suspect he falls into the loyal-friend-of-the-CEO category. Highly technical functions need a highly technical top decision-maker. I don't know Keivan at all, but by his bio he isn't that.
GPUs and not really doing anything with their AI IPs until relatively recently are definitely a big miss.
Yup, and losing Raja Koduri is indicative of some real dysfunction at the corporate strategic level. I bet the politics would make a great movie. IMO, Intel really blew it with GPUs.
The photonics sell-off was just the pluggable transceivers portion, right?
I was thinking of Si Photonics, which Intel had been working on for many years, and then cancelled it.
If that is the case isn't that a low value add part of the photonics pie (at least compared to the on-package parts of it)?
Agreed, transceivers are probably not an Intel-class investment with Si Photonics on the horizon.
4/5G modems also seem like a waste of time after intel's mobile play went bust.
Not IMO. I think it was a failure of leadership.
Apple will eventually go to an integrated solution, and everyone else already had their own solution baked into their SOCs. As for the memory business, FPGA, and Mobileye stuff I liked it in theory.
Thank you for bringing up FPGAs. I knew my list was missing a big one. I don't think Intel really tried to make FPGAs a leading product group. Xilinx (AMD) I think did a lot better. As for Optane, what a mess, but Gelsinger had little (if anything) to do with it. For some strange reason Intel engineers seem to have totally missed the coming NAND chip layering strategy. Optane is apparently much (MUCH) more difficult to layer (in "decks"), and lost the density race big time. Also, Optane specs don't seem to live up to promises I remember from a decade ago. And needing a DRAM cache in front of Optane DIMMs added a lot of complexity and cost. Lots of promise, apparently lesser reality.
In practice isn't having these things as an adjacency to sell more CPUs a mistake as it makes it harder to sell those products if they must be tied to Xeon or intel laptops?
Agreed. I never liked Intel's strategy that everything had to be a CPU adjacency. That strategy always handcuffed the other technologies.
Would it not be better for them to do the Mobileye thing so they can do what they need to do to have successful products rather than pumping up NEX or DCAI revenues?
Exactly.
 
This was a big part of what went wrong with Xpoint. They built on a scale to go after an exabyte industry, but made a product that only worked with Xeons at a premium price, which was a petabyte niche market strategy. Inventory piled up and fabs were idled.
Optane needed a DRAM cache for memory extension, which added cost and complexity. For SSDs, not only didn't it make $/GB sense, but to realize the speed the SSDs needed a superfast controller, and Intel didn't have it. The best Optane SSD design win I'm aware of is in Vast Data storage appliances as a write cache.
 
I have no problem with Ann at all. She has DEEP fab technical and managerial experience, but now she's moved into technology development.
I asked because you were talking about being able to catch the next "10nm" before it is too late to readjust, which would be an issue that is in her wheelhouse.
Keivan is a logistics guy, not a fab guy, and I suspect he falls into the loyal-friend-of-the-CEO category. Highly technical functions need a highly technical top decision-maker. I don't know Keivan at all, but by his bio he isn't that.
If memory serves he was in charge of NVM manufacturing in the joint Micron-Intel days and the post split days. He also supposedly used to be factory manager of NAND TD at D2 and later fab68.
I was thinking of Si Photonics, which Intel had been working on for many years, and then cancelled it.
Did they really cancel even the in package part? I could have sworn it was just the pluggable optical connectors part? If the full photonics initiative was canned that is truly a grave blow to DCAI and IFS. It also feels like that kind of defeats a good part of the value prop of moving to glass substrates if you don't have any photonics to go with it.
Not IMO. I think it was a failure of leadership.
Could you expand upon this? I don't really know why you would bother to continue making modems if you already left the business of making mobile SOCs, and your only potential customer planned to develop their own integrated solution?
Thank you for bringing up FPGAs. I knew my list was missing a big one. I don't think Intel really tried to make FPGAs a leading product group. Xilinx (AMD) I think did a lot better.
That was my point. If memory serves there was a big push in the BK and BS days of making FPGA accelerator cards to supplement Xeons? Of course we all saw the end result of intel (and to a lesser extent Xilinx) focusing only on the high end. The low end got stolen away from intel/Xilinx by all of those smaller start ups as they competed against the older lineups from intel/Xilinx. I wonder if AMD will make the same mistake and only focus on how Xilinx can help AMD, rather than how AMD can help Xilinx?
As for Optane, what a mess, but Gelsinger had little (if anything) to do with it. For some strange reason Intel engineers seem to have totally missed the coming NAND chip layering strategy. Optane is apparently much (MUCH) more difficult to layer (in "decks"), and lost the density race big time.
I have a hard time imagining that neither intel nor Micron knew that 3D NAND was coming given they were relatively early movers on 3D-NAND. Heck their announcement was after they had announced their 32L NAND. As an outsider looking in; they probably thought there would be more workloads that valued the latency over NAND, or the persistence and capacity over DRAM. Couple that with X-Point seemingly being late, and the 4 deck version coming out years after the 2 deck version and the economics would have been uglier than either party would have originally expected.
 
Here is another historical milestone that is typically forgotten: Intel licensed the Atom core to TSMC in 2009 to compete with the ARM cores that were then emerging.
Too little, too late. ARM cores were already shipping in hundreds of millions of mobiles, and there was a complete ecosystem around AMBA and a range of cores you could choose from. The big customers were already getting briefed on v8 (64-bit ARM). Apple was already deep into work on its first in-house chip, and the iPhone was zooming up the charts.

Intel should have been able to project the importance of mobile 10 years before that. The exponential sales curves and rapid rise in functionality were clear. Intel was oblivious, Atom was sad.
 
Optane needed a DRAM cache for memory extension, which added cost and complexity. For SSDs, not only didn't it make $/GB sense, but to realize the speed the SSDs needed a superfast controller, and Intel didn't have it. The best Optane SSD design win I'm aware of is in Vast Data storage appliances as a write cache.
Intel simply had no clue what the market might be. And no concept of the real value. They knew it could not beat NAND on price, but they did not understand it could not match DRAM value because of perf. They thought it would sell like hotcakes because of persistence, at a price higher than DRAM, but the reality is that DRAM is effectively persistent while the power is on, durability only comes from replicating to multiple separate systems anyway, and software that worked that way was already mature, so the only folks interested in persistent main memory were running obscure experiments.

When we told Intel the price had to undercut DRAM because the perf was poor and persistence of little interest, my impression was they thought we were lying and it was just a negotiation tactic. They just kept going, and just as predicted it did not sell.

Then, the inability to attach their DIMMs to anything other than a Xeon made vendor lock-in a red flashing problem. It did not help that it sucked bandwidth away from the DRAM you needed to support it, and the first version had problems with mixing reads and writes that dragged perf way down. Which we could have found and fixed if they were not so secretive before releasing it. Not Intel's finest effort.

I still believe the tech could have won coming in at 1/3rd the cost of DRAM (about 20x the cost of NAND) if they had adopted CCIX (which already worked like CXL.mem) for open attach to a larger market, and worked openly with customers to try out the use cases. The manufacturing looked like that price would be OK, and there is nothing that motivates customers like a price cut on their most expensive system component (memory).
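
To put rough numbers on the pricing point above, here is a minimal back-of-envelope sketch in Python. The $/GB figures are illustrative assumptions, not actual market prices; the only point is that a target of 1/3 the cost of DRAM works out to roughly 20x the cost of NAND when DRAM runs on the order of 60x NAND per GB.

# Back-of-envelope check of the pricing argument above.
# The $/GB figures below are illustrative assumptions, not real market prices.

dram_per_gb = 6.00   # assumed DRAM price, $/GB
nand_per_gb = 0.10   # assumed NAND price, $/GB

# The claim: persistent memory needed to land at ~1/3 of DRAM $/GB to win on cost,
# since its performance trails DRAM and persistence alone wasn't compelling.
optane_target_per_gb = dram_per_gb / 3

print(f"Target price: ${optane_target_per_gb:.2f}/GB")
print(f"Multiple over NAND: {optane_target_per_gb / nand_per_gb:.0f}x")
# With these assumed inputs the target is ~$2/GB, i.e. ~20x NAND,
# which matches the "1/3 of DRAM, about 20x NAND" framing in the post.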

How is this relevant to their future? Well, that Optane-style inward-looking product development culture will not work well for success in the IFS market. The IDM model's weakness is how inward looking they are, xeons and fabs locked in an internal embrace which is stale and brittle. TSMC shows how much more vibrant the IFS model is, and Intel will need to split to really ensure they shift their culture.
 
If memory serves he was in charge of NVM manufacturing in the joint Micron-Intel days and the post split days. He also supposedly used to be factory manager of NAND TD at D2 and later fab68.
Not technical enough to suit me.
Did they really cancel even the in package part? I could have sworn it was just the pluggable optical connectors part? If the full photonics initiative was canned that is truly a grave blow to DCAI and IFS. It also feels like that kind of defeats a good part of the value prop of moving to glass substrates if you don't have any photonics to go with it.
I think they canned the first iteration of Si Photonics because it needed special wafers.
Could you expand upon this? I don't really know why you would bother to continue making modems if you already left the business of making mobile SOCs, and your only potential customer planned to develop their own integrated solution?
5G modems are a very high volume part, and quite complex so there's a high barrier to entry. Right up Intel's alley, but RF stuff is not an Intel strong point.
I have a hard time imagining that neither intel nor Micron knew that 3D NAND was coming given they were relatively early movers on 3D-NAND.
Me too. But I can't explain some of the material I saw any other way. And this was ~2010.
Heck their announcement was after they had announced their 32L NAND. As an outsider looking in; they probably thought there would be more workloads that valued the latency over NAND, or the persistence and capacity over DRAM. Couple that with X-Point seemingly being late, and the 4 deck version coming out years after the 2 deck version and the economics would have been uglier than either party would have originally expected.
Agreed.
 
“we killed the one that would have made all the difference in the world.” This was the American chip maker’s attempt at making a proper graphics card.
Larrabee was going to be a graphics processor made from original Pentium-based CPUs.

Let me say the obvious. It was NOT an attempt to make a graphics card. A graphics card does graphics. What was Larrabee? It was lots of CPU cores in one chip. You can call it a GPGPU all you want. Instead of actually dedicating themselves to building a damn GPU, they said to themselves, "Hey, let's just get a bunch of small CPUs, mash 'em together, and we'll see if they can do any good as a GPU." Fun fact: they can't. Who would have thunk it? Just about everyone who was into tech.


Intel couldn't get into the smartphone business. No shoit, Sherlock. Here we have the premier-league microprocessor company in the world, and it dilly-dallies around and makes a half-hearted effort at an Atom processor offering. Atom. Might as well have called it "slow as all hell and sips too much power." Atom was basically already around for all those tiny laptops, which were slow as. Instead of thinking to themselves, "maybe, just maybe, even if we can't get them using x86 processors and therefore lock them into our ecosystem" - which is what Intel was trying to do, of course. They had green $$ in their eyes already, just daydreaming about the $$, but it never came, because maybe ARM was just that much more efficient and better, and let the customers not be stuck in some dead-end, greedy, grubby, monopolistic x86 menagerie. Nooo. Instead of thinking "what does the customer want?", "can we provide that for them in x86?", "well, it appears we can't, so maybe we will get into the business of providing top-notch ARM chips for them to buy, so we can get that revenue and margin." But could they do that? NO! We will just stick with PC chips and server chips and that's it! Oh wait, we will buy an FPGA company, then years later sell it, then buy it again years later. Oh, we will buy a VIDEO GAME COMPANY AND DO NOTHING WITH THEM OR THEIR GAME!!! Could have seen that coming a mile away. Lots of people thought "this game looks cool", "I wouldn't mind playing that". But no! You buy them and do nothing with them, wasting that money away. So much wasted money, it's unbelievable. Instead of offering to build top-notch ARM chips to sell to smartphone makers, you just sit there watching the GOLDBOOM go by as you spend years doing share buybacks with your bailout government money and the huge lucrative market you had the majority of for AGES on the server and PC client side. Fabs? R&D? Nah, share buybacks. This won't come back to bite us in the arse. MediaTek reported $19 billion in revenue for the year, at 47% margin. Qualcomm: $44.2 billion revenue, 22.98% margin. Could Intel have made a ton of revenue and income if it had said to itself, "Maybe x86 sucks in a handheld for a phone. Maybe, for that particular niche, we will get an ARM license and try to supply everyone with chips, en masse"? NOPE! Can't pull their heads out of their own buttocks.

Nvidia getting into the ARM processor business? YES. AMD getting into the ARM processor business? YES. Google just hinted at wanting RISC-V to have first-class support in the Google ecosystem. Perhaps Intel, a microprocessor company, might want to start making some RISC-V chips? Be a provider of RISC-V chips? Get in early?
NUP! No way!

Did you hear that Intel wants to sell Mobileye? Mobileye's growth is quite large, YoY. LET'S SELL IT, the greedy little green-$$ Intel people say! We don't want any part of our company to have huge amounts of growth!

Let me tell you the story of Intel.

Intel turned into Icarus, flew too close to the Sun, thought it could buy the Sun with all its money, thought it was a Sun God. I don't know how much faith I have left in Intel at all. From all I can see, it's bad news.
 
Don't forget the massive layoffs of 2015/2016, which were done for basically no reason and further alienated a workforce that was already struggling to manifest stuff like 10nm and Optane. They thought replacing their most experienced engineers with H-1Bs and PhDs would shake things up and send the right message, a real bean-counter/HR sicko mentality running the show.
 