CES 2021 Intel new cpu. Question about the 10nm yield

Intel has been talking about their 10nm process since at least 2015, and has reported many delays since then in achieving viable yields. It's quite a sad story, especially since Intel is one of the few remaining American semiconductor manufacturers. There are many blogs on this topic here on SemiWiki.
 
Does anyone know the cause of the poor yields? I have heard it's the cobalt, the cobalt liner, or quad patterning. If it's quad patterning, they should have moved to EUV earlier, but then again I am biased toward EUV.
Early on, Intel said they tried too big a "shrink" in dimensions, jumping 1 1/2 or 2 nodes at once, and that this was the cause of the poor yield (probably litho). Also, adding a new conductor (cobalt) compounded their problems.
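The "too big a shrink" argument is easy to see with the standard Poisson yield model, where the fraction of defect-free dies falls exponentially with die area times defect density. A minimal sketch, with defect densities invented purely for illustration (not actual Intel data):

```python
import math

def poisson_yield(die_area_cm2: float, d0_per_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-die_area_cm2 * d0_per_cm2)

# Hypothetical defect densities, illustration only:
# a mature process around D0 = 0.1 defects/cm^2 versus a struggling
# new node at D0 = 2.0 defects/cm^2, for a 1 cm^2 (100 mm^2) die.
mature = poisson_yield(1.0, 0.1)      # ~0.90 -> ~90% defect-free dies
struggling = poisson_yield(1.0, 2.0)  # ~0.14 -> ~14% defect-free dies
```

Every extra patterning step is another chance to add defects, which is one reason an aggressive multi-patterning scheme (like quad patterning) tends to push D0 up on a new node.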
 
Does anyone know the cause of the poor yields? I have heard it's the cobalt, the cobalt liner, or quad patterning. If it's quad patterning, they should have moved to EUV earlier, but then again I am biased toward EUV.
The secrecy at Intel is legendary. When I worked in a little building in Santa Cruz, CA with only 12 people the security folks from Santa Clara would randomly send over an unannounced stranger to see how much intellectual property that he could steal from our building, and the good news is that we caught the "thief" every time. We even placed nonsensical watermark mask geometries into each IC layout, just in case a competitor stole the masks or reverse engineered the layout, so that we could prove in a court of law that the watermark had also been literally copied. Any Intel employee that answered your yield question would be immediately fired and likely taken to court. So good luck with that.
 
Rumors do trickle out of a fab, especially since vendors are in the fab trying to improve the equipment/process that is "causing" the problem. Even a bystander in the fab can see where all the effort is going when there is a yield problem.
 
Rumors do trickle out of a fab, especially since vendors are in the fab trying to improve the equipment/process that is "causing" the problem. Even a bystander in the fab can see where all the effort is going when there is a yield problem.
Yes, and any vendor that leaks what is happening inside an Intel fab will never set foot in Intel again, and will likely end up in court, because they must sign a Non-Disclosure Agreement.
 
Intel showed off their latest 10nm CPU. My question comes from my own research (googling): there was a news report about 4 weeks ago claiming their 10nm yields "are nowhere near viable for 'full production'". Did Intel fix their 10nm yield problem, or is Notebook Check wrong?

Daniel Payne is correct, yield is a forbidden subject. We generally find out about yield problems after the fact. There was however an interesting presentation at IEDM last month:

41.2. Intel's take on design-technology co-optimization (DTCO)

Over half of the discussion was bemoaning the fact that EUV was "really, really late". It meant that they had to re-design for more restrictive design rules -- e.g., unidirectional low-level metal layers. They showed pictures of rather intricate M1 wires in custom circuit layouts assuming that EUV would be ready for 10nm, layouts that could not be resolved easily with 193i litho. So, they had to split the layouts into two unidirectional metal layers.

After they were done complaining about the availability of EUV, they finally mentioned that "we have started pursuing cell designs consistent with first-generation EUV based on some DTCO simulations, and are planning on high-NA EUV in the near future". High-NA EUV is not really near, however (2025?).

OK, perhaps they were relying too heavily on ASML, who were late delivering production EUV. And perhaps they should have taken a more prudent approach toward EUV, like TSMC (although TSMC's customers primarily use cell libraries and do not rely extensively on custom cell layout).

Bottom line: The very aggressive goals for Intel's 10nm process came back to haunt them...
 
EUV could have been ready earlier if they had not decreased funding for it at a critical stage in its development in 2005. They had just developed 193i with a researcher at RIT, as I recall.
 
I think Daniel Payne is right about where Intel fab info does NOT come from (vendors). I recall the info about litho problems with the aggressive shrink came from an actual Intel disclosure 4+ years ago. Ditto for Cobalt.
 
Speaking of Intel 10nm, here is a recent headline:

2nd Gen AMD EPYC Processor is the World’s First 7nm x86 Datacenter Processor

As we all know, Intel 10nm is slightly denser than TSMC 7nm, so this is a false narrative.

It's déjà vu of the 14/16nm debacle. Intel came out with 14nm FinFETs, which were denser than the Samsung and TSMC implementations. Out of respect for Intel, Morris Chang insisted their process be named 16nm. Samsung, however, stuck with the 14nm name even though Samsung 14nm was the same density as TSMC 16nm. TSMC then had to explain to each and every customer that Samsung 14nm and TSMC 16nm were actually the same density. Lesson learned. Hopefully Intel will learn this lesson as well.
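A quick back-of-the-envelope on the density claim, using the peak logic transistor density estimates widely cited in the trade press. Treat these numbers as approximate third-party figures, not vendor-confirmed data:

```python
# Widely cited peak logic density estimates in MTr/mm^2
# (approximate trade-press figures, not vendor-confirmed):
density_mtr_mm2 = {
    "Intel 10nm": 100.8,
    "TSMC N7": 91.2,
}

ratio = density_mtr_mm2["Intel 10nm"] / density_mtr_mm2["TSMC N7"]
print(f"Intel 10nm is ~{(ratio - 1) * 100:.0f}% denser than TSMC N7")
```

By these figures the Intel node comes out roughly 10% denser, which is why "world's first 7nm x86" is a marketing statement about the node name rather than the actual density.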

 
There's incredible potential in the 2, 3, and 4 nm nodes, as well as insatiable demand, but there's a question about yields.

There will be 3 nm devices by the end of this year or next year, according to Rotter, but the consumer will have to pay, and releases will be limited. Consumers will be stuck with 10 nm for some time.
 
EUV could have been ready earlier if they had not decreased funding for it at a critical stage in its development in 2005. They had just developed 193i with a researcher at RIT, as I recall.
Intel had been driving EUV before Samsung or TSMC. There wasn't a strong enough source for a long while. Even now, the source power level is topping out, so new source types are being considered.
 
I think the yield issue is in how they are patterning M0 and M1; the whole sequence is unlike anything I have ever seen. I recently saw TechInsights SEM images of the 10nm SuperFin M0 and M1 layers, and the patterns look terrible.
 
I had read earlier (IITC 2018) that Al2O3 was used as the dielectric between cobalt line ends; I wonder about the complexity of that SiO2 replacement.
 
Looking at Intel's current Core product lineup: for the 11th generation there are 8 models of i3, 8 of i5, and 9 of i7, plus 10 models of the 10th-generation i9 processors. That's a whopping 35 varieties in total, with two models in each category designed for embedded systems.

I can understand that Intel is trying to tailor its products to various customers' needs. But on the other hand, is it possible that Intel just can't get uniform quality or yield from its production? Instead of throwing away inferior output, they create a multi-tier rating/naming system to salvage value from every chip they make? I hope I'm wrong on this.
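For what it's worth, selling slower or partially degraded dies into lower tiers (speed binning) is standard practice across the industry, so a wide SKU spread does not by itself prove a yield problem. A toy sketch of how binning salvages value, with tier names, cutoffs, and the clock distribution all invented for illustration:

```python
import random

# Invented tier names and fmax cutoffs, purely illustrative.
TIERS = [("i9-class", 5.0), ("i7-class", 4.6),
         ("i5-class", 4.2), ("i3-class", 3.8)]

def bin_die(fmax_ghz: float) -> str:
    """Assign a die to the highest tier whose clock cutoff it meets."""
    for name, cutoff in TIERS:
        if fmax_ghz >= cutoff:
            return name
    return "scrap"

# Model each die's max stable clock as a normal distribution
# (made-up mean/sigma); anything above the lowest bar is sellable.
random.seed(0)
dies = [random.gauss(4.5, 0.3) for _ in range(10_000)]
counts = {}
for fmax in dies:
    tier = bin_die(fmax)
    counts[tier] = counts.get(tier, 0) + 1
print(counts)
```

The wider the spread in fmax across the wafer, the more value there is in having many tiers, which is consistent with either deliberate segmentation or process variability.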
 