AMD says Intel's 'horrible product' is causing Ryzen 9 9800X3D shortages

Got it - I agree to disagree then :).

They benchmarked games at hundreds of frames per second. First, that surpasses the capabilities of a lot of monitors. Second, it is well over the human biological limit.
 

They benchmarked games at hundreds of frames per second. First, that surpasses the capabilities of a lot of monitors. Second, it is well over the human biological limit.
There's a bit more to this than the article states. I'm guessing you've never used a 120 Hz or higher display for any length of time.

If you use a 120 Hz monitor for any length of time, you can easily tell the difference versus 60 Hz: the mouse no longer feels smooth when you drop down to the slower refresh rate. I'm "middle aged", but it's quite obvious even to me.

In VR you get far less nausea at 90 or 120 Hz than at 60 Hz.

I do think 500 fps/Hz is ridiculous, but 120/144 Hz is a great upgrade over 60. 240 Hz is good for competitive FPS games, but not necessary for a good gaming experience otherwise.
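A rough sketch of the frame-time arithmetic behind that (plain Python, my own illustration rather than anything from the article): the milliseconds available per frame shrink fast going from 60 Hz to 120/144 Hz, but each further doubling buys much less.

def frame_time_ms(refresh_hz: float) -> float:
    """Time available to draw one frame, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240, 500):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")

# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 240 Hz -> 4.17 ms, 500 Hz -> 2.00 ms:
# the 60 -> 120 Hz jump saves ~8.3 ms per frame, while 240 -> 500 Hz saves only ~2.2 ms.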
 

They benchmarked games at hundreds of frames per second. First, that surpasses the capabilities of a lot of monitors. Second, it is well over the human biological limit.
Also try this at different refresh rates and tell me you can't see the difference (blur test): https://www.testufo.com/

Your eyes/brain get more accurate data during movement at higher frame rates.
 
Also try this at different refresh rates and tell me you can't see the difference (blur test): https://www.testufo.com/

Your eyes/brain get more accurate data during movement at higher frame rates.
My monitor only supports 60 Hz, but my iPad Pro has a high refresh rate display.

I just think running games at several hundred frames per second is not representative. Then you divide the cost of a GPU by those several hundred fps. I just don't know what that division actually conveys to end users.
 
I have a 240 Hz display and I can tell the difference between 60/120 Hz and 240 Hz. We may not be able to visibly see the difference between 120 and 240 Hz, but we can feel the smoothness of the reactions.
 
I just think running games at several hundred frames per second is not representative. Then you divide the cost of a GPU by those several hundred fps. I just don't know what that division actually conveys to end users.
I think the "Cost per Frame" is mainly useful for when comparing generational uplifts, academically. For example - 5090 pushes more frames than 4090, but costs more. Did they actually give the user a "value" gain, or is it just plain more expensive.

Some people use it as "bang for buck", and it points out GPUs that are pretty ugly values (i.e. RTX 4080).

It's not super useful overall, but it's good academic data to see if the GPU makers are gouging on newer generation cards.
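As a hypothetical illustration of that metric (the prices and fps below are made-up placeholders, not real review data), "cost per frame" is just the GPU price divided by its average benchmark fps:

def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars of GPU price per average frame per second."""
    return price_usd / avg_fps

# Placeholder figures for two hypothetical generations of cards.
gpus = {
    "last-gen card": (1000.0, 120.0),   # (price in USD, average fps)
    "new-gen card": (1500.0, 160.0),
}

for name, (price, fps) in gpus.items():
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per fps")

# If the newer card's $/fps comes out higher (here ~$9.38 vs ~$8.33), the extra frames
# came at a worse "value", which is what the generational comparison tries to surface.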

 
I have a 240 Hz display and I can tell the difference between 60/120 Hz and 240 Hz. We may not be able to visibly see the difference between 120 and 240 Hz, but we can feel the smoothness of the reactions.
I'm running 160 Hz now, and I'm thinking 240 Hz for my next display. Good to hear anecdotally that you can tell the difference.
 
I'm running 160 Hz now, and I'm thinking 240 Hz for my next display. Good to hear anecdotally that you can tell the difference.
The ability to perceive the difference at such high refresh rates varies from person to person; many can tell and many can't, so please keep that in mind.
 
I think that focusing on gaming performance is a mistake for the CPU industry, as this market is small and shrinking. Gaming consoles will be the future of gaming, IMHO.

In fact, desktop PCs in general are not of great strategic importance to x86, IMHO. The future is data center and mobile (including laptops).

In that regard, PC frame rates seem like a very strange metric for gauging the viability of a new CPU architecture.
 
I think that focusing on gaming performance is a mistake for the CPU industry, as this market is small and shrinking. Gaming consoles will be the future of gaming, IMHO.

In fact, desktop PCs in general are not of great strategic importance to x86, IMHO. The future is data center and mobile (including laptops).

In that regard, PC frame rates seem like a very strange metric for gauging the viability of a new CPU architecture.
One thing worth considering: an average user who both games (even only occasionally) and uses apps can probably wait a few extra seconds for a large app to do something, or tolerate a web page loading 10% slower. But reduce fps below a certain threshold and it's noticeable.

It sounds like you're not a PC gamer, so this metric is probably worthless for you. But if you are spending $$ on a GPU for gaming (which millions do), it's important to select the right CPU for the intended gaming use cases.

Xyang brought up a YouTube channel and a website dedicated to PC gaming, so of course the focus is on gaming applications.

It's hard to say whether PC gaming will die in the next 20 years or not. Nvidia wouldn't notice though, with AI :). AMD and Intel still sell a lot of gaming-focused equipment, and for AMD it's higher margin than console sales.
 
I think that focusing on gaming performance is a mistake for the CPU industry, as this market is small and shrinking. Gaming consoles will be the future of gaming, IMHO.

In fact, desktop PCs in general are not of great strategic importance to x86, IMHO. The future is data center and mobile (including laptops).

In that regard, PC frame rates seem like a very strange metric for gauging the viability of a new CPU architecture.

What AMD does: design a gaming CPU, then turn it into a server die later. Gaming and the enthusiast client segment are the trend setters and direction setters for the wider industry. It is easier to make a chip for interactive multimedia applications and then chop off whatever is needed from it to make an office PC die, embedded product, or server CPU than the other way around.

There is a far deeper rationale for aiming at this "small and shrinking" sector.

Phone SoCs have basically differentiated into gaming and non-gaming ones. The ones without gaming cred were pushed out of the market regardless of what they were trying to differentiate into. This is a very old observation: whatever chip you make for an interactive product, if it does not suit gamers and enthusiasts, it's a dead niche in the long term.
 
Gaming CPUs help sell the product; they are the marketing star. Even if such a CPU consumes 300 W, it sets the trend in the market. It is basically free marketing in desktops.
 
All good points.

Integrated graphics is rapidly approaching the "good enough" point for laptop gaming, and gaming laptops are a tiny percentage of laptops sold. The lion's share of laptops are sold to businesses, despite the fact that more people own a personal laptop now than in the past. Note: personal laptops are ALSO being replaced by cell phones. My wife (as an example) rarely ever touches any PC or laptop at home.

Clearly AMD has found a way to design a CPU in a modular way so that the core elements may be used across a generation, and even the CCX may be used across most of the generation regardless of market.

I expect this practice to continue and expand. Custom targeted core design elements will be blended with general elements through chiplets (and within chiplets). Gaming is fairly easy for AMD to customize for as they simply slap a butt ton of L3 cache over the compute CCX ;). It just happens to be that games (most of them) thrive with lots of low latency memory access.

I would argue that the primary design considerations for future processors are scalability in performance and cost, modularity across markets, and yield on the newest processes. In these areas, AMD has led Intel. They have done such a good job of it that they are currently able to best Intel from a node behind.

These things I mention are higher level design concepts than we usually cover in forums like this. We tend to focus on the CPU element and its design elements and of course, fabrication.

The market for gaming will (IMO) gravitate more toward those custom targeted core designs, and those core designs will be integrated into consoles, while the desktop PC market, including gaming PCs, continues to shrink.

Of course, these are all just my speculation.
 
All good points.

Integrated graphics is rapidly approaching the "good enough" point for laptop gaming, and gaming laptops are a tiny percentage of laptops sold. The lion's share of laptops are sold to businesses, despite the fact that more people own a personal laptop now than in the past. Note: personal laptops are ALSO being replaced by cell phones. My wife (as an example) rarely ever touches any PC or laptop at home.

Clearly AMD has found a way to design a CPU in a modular way so that the core elements may be used across a generation, and even the CCX may be used across most of the generation regardless of market.

I expect this practice to continue and expand. Custom targeted core design elements will be blended with general elements through chiplets (and within chiplets). Gaming is fairly easy for AMD to customize for as they simply slap a butt ton of L3 cache over the compute CCX ;). It just happens to be that games (most of them) thrive with lots of low latency memory access.

I would argue that the primary design considerations for future processors are scalability in performance and cost, modularity across markets, and yield on the newest processes. In these areas, AMD has led Intel. They have done such a good job of it that they are currently able to best Intel from a node behind.

These things I mention are higher level design concepts than we usually cover in forums like this. We tend to focus on the CPU element and its design elements and of course, fabrication.

The market for gaming will (IMO) gravitate more toward those custom targeted core designs, and those core designs will be integrated into consoles, while the desktop PC market, including gaming PCs, continues to shrink.

Of course, these are all just my speculation.
Maybe that's why Intel has a big LLC planned for Nova Lake. As for yields, that credit is TSMC's to take; had AMD stuck with GF, it wouldn't have been possible. And the competition is going to intensify even more now.

Despite their massive lead, for a few years they were gaining market share only slowly, and that was with a struggling Intel; now it's going to be even more difficult to capture it all.
 
Gaming CPUs help sell the product; they are the marketing star. Even if such a CPU consumes 300 W, it sets the trend in the market. It is basically free marketing in desktops.

Correct, but it's not only marketing. The entire computing ecosystem treats gaming as the imaginary performance apex for any application. This way, all trends are set around the needs of the very rich enthusiast and later propagate down.

From compiler developers to people doing serious linear algebra number crunching, at some point everyone has to deal with the fact that the hardware is aimed at that market, and they adapt to it.
 