Has Intel Stumbled Again?

Intel's performance advantage was based on its process lead. I only use three things on my laptop: Zoom or Skype, word processing, and the browser. The Ryzen 7 is better for what I need. Intel goes it alone while AMD uses the ecosystem. I'm not paying $1,000 for a laptop with a Celeron or a 10th- or 11th-generation processor, plus another $60 to Microsoft. The i5 and i7 are good, but at that point it costs as much as a MacBook. If you're paying for Intel's peak performance, it's a short window until something better comes out. Intel is living off its legacy and off customers who don't know better.
This situation has nearly always been the case. In the 1990s, if you only used a personal computer for DOS word processing and dialing into BBSes, any clone or "last year's chip" would perform well.

OTOH, if you're buying for employees, there's a better chance that more performance means more productivity. Scenario 2, then as now, was that some people bought the "best" because they thought it might buy them another year before they "had" to upgrade.

The only difference between now and then is that application performance is improving more slowly over time (for a lot of reasons).
 
This situation has nearly always been the case. In the 1990s, if you only used a personal computer for DOS word processing and dialing into BBSes, any clone or "last year's chip" would perform well.
Not really. Quality over Zoom is important, and AMD far surpasses Intel with this year's models, so last year's Intel part against this year's AMD would fare even worse. I'm willing to pay extra for the comfort of extra performance, and the reality is that this year's AMD products are cheaper than last year's Intel ones. Intel is losing on both cost and quality.

OTOH, if you're buying for employees, there's a better chance that more performance means more productivity. Scenario 2, then as now, was that some people bought the "best" because they thought it might buy them another year before they "had" to upgrade.

The only difference between now and then is that application performance is improving more slowly over time (for a lot of reasons).
In terms of nanometers you're right, but in terms of software and architecture innovation from the fabless ecosystem, it has never moved faster. It's so fast the public can't keep up with the innovations. People now need to replace PCs and laptops based on the latest architecture, not just the node size.

Intel will need to reinvent itself.
 
The only difference between now and then is that application performance is improving more slowly over time (for a lot of reasons).
Which applications are you referring to? Web browsers, for example, are far more complex and resource-hungry than they used to be. Due to the proliferation of video at increasingly high definition levels, web browsers now have "helper" and "renderer" threads, and they need quite a bit of memory throughput and capacity. In Microsoft's case, Edge runs as multiple processes to isolate some web apps and plug-ins for security reasons. Security software is very CPU- and memory-intensive, and seems to get worse every year. I use McAfee products for Windows and macOS, and they are CPU and memory hogs. Not to mention the OS itself; they seem to get bigger with every release. I suppose word processing, spreadsheets, and email have mostly modest needs, but many users play games, edit videos, edit photos, and use other apps that are very resource-hungry. Even 3D printing apps, like my son-in-law uses. I don't use those apps, especially games, but I know a lot of people who do. A newer CPU can make an easily discernible difference to them.
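You can see the multi-process design for yourself. Here is a minimal sketch, assuming the third-party psutil package; the process-name prefixes are my assumptions and vary by platform (e.g. "msedge" on Windows):

import psutil

# Hypothetical name prefixes for common browsers; adjust for your system.
BROWSERS = ("msedge", "chrome", "firefox")

# Collect every running process whose name matches, skipping entries
# whose info psutil could not read.
procs = [p for p in psutil.process_iter(["name", "memory_info"])
         if p.info["name"] and p.info["memory_info"]
         and p.info["name"].lower().startswith(BROWSERS)]

# Sum resident memory across all of the browser's helper/renderer processes.
total_rss = sum(p.info["memory_info"].rss for p in procs)
print(f"{len(procs)} browser processes using {total_rss / 2**30:.1f} GiB resident")

On a machine with a few video-heavy tabs open, a count well into the dozens is typical, which is exactly the memory pressure described above.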

To me, however, the latest I/O is much more important. Nothing speeds up a system like multiple DDR5 DIMMs (or memory integrated in the package, a la M-series Macs), PCIe Gen4 SSDs, and the best WiFi support. And to get this stuff you probably need to buy a new CPU. (One trend to watch out for, especially with DDR4, is that 16GB DIMMs are readily available, so you can end up with 16GB of memory on a single DDR4 channel, which means you get half the effective memory bandwidth compared to two 8GB DIMMs on two channels; the rough arithmetic is sketched below. I was advising a relatively non-technical friend about a new laptop he wanted to use for light gaming, among other apps, and was surprised to see a single DDR4 DIMM. On my suggestion he ordered 32GB of memory to use both memory channels, which he says was a good idea anyway: once I showed him how to monitor DRAM utilization, he found he's often using more than 25GB.)
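As a back-of-the-envelope check on the "half the bandwidth" point, here is a small Python sketch. DDR4-3200 is an assumption for illustration, and these are theoretical peaks; real-world throughput is lower:

def peak_bandwidth_gb_s(mt_per_s: float, bus_bytes: int, channels: int) -> float:
    """Peak DRAM bandwidth in GB/s: transfer rate (MT/s) x channel
    width in bytes x number of populated channels."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

# DDR4-3200 with a 64-bit (8-byte) channel:
print(peak_bandwidth_gb_s(3200, 8, 1))  # 25.6 GB/s -- one 16GB DIMM, one channel
print(peak_bandwidth_gb_s(3200, 8, 2))  # 51.2 GB/s -- two 8GB DIMMs, two channels

Same capacity either way; the two-DIMM configuration simply doubles the usable channel width.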

I see your point about some users being satisfied with previous generation client hardware, but that decision, IMO, takes some thought. Sometimes cheaper stuff is cheaper for a reason.
 
This situation has nearly always been the case. In the 1990s, if you only used a personal computer for DOS word processing and dialing into BBSes, any clone or "last year's chip" would perform well.

OTOH, if you're buying for employees, there's a better chance that more performance means more productivity. Scenario 2, then as now, was that some people bought the "best" because they thought it might buy them another year before they "had" to upgrade.

The only difference between now and then is that application performance is improving more slowly over time (for a lot of reasons).

We have to talk about market drivers:

Why in the world have people not remained at an i386 level of performance if that's enough for virtually everything an average user really needs?

The single biggest mover is Windows: eventually a new Windows version comes along, which forces an upgrade so users can run new software. Each Windows version is more bloated than the previous one.

The second driver is web browsing: JS gets ever more ridiculous, so much so that Arm even added a dedicated instruction (FJCVTZS, in ARMv8.3, pushed by Apple) to hardware-accelerate it; a sketch of the conversion it performs follows this list.

Third: video games. Game engines get more and more atrocious when it comes to CPU-usage efficiency with each generation.
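To make the JS point concrete: the hot operation is ECMAScript's double-to-int32 conversion (truncate toward zero, wrap modulo 2^32, NaN to 0), which is roughly what FJCVTZS computes in a single instruction. A minimal Python sketch, simplified from the spec's edge-case handling:

import math

def to_int32(x: float) -> int:
    """ECMAScript-style ToInt32: NaN/infinity map to 0, truncate toward
    zero, then wrap modulo 2**32 into the signed 32-bit range."""
    if math.isnan(x) or math.isinf(x):
        return 0
    n = int(x) & 0xFFFFFFFF            # truncate toward zero, wrap mod 2**32
    return n - 0x100000000 if n >= 0x80000000 else n

assert to_int32(3.9) == 3
assert to_int32(-3.9) == -3
assert to_int32(2.0**31) == -2**31     # wraps into the signed range
assert to_int32(float("nan")) == 0     # unlike plain int(), NaN maps to 0

JS engines run this conversion constantly because every JS number is a double, which is why doing it in one hardware instruction instead of a multi-instruction sequence was worth an architecture change.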

There is absolutely zero impact from AI, Bitcoin, cloud, etc. for 99.9999% of users. In fact, the reason the slowdown in CPU performance improvements isn't felt as much recently is mainly that desktop software vendors have almost completely stopped boxed sales and moved to "evergreening" their installations.

The development budgets at big semi companies have swung so far into non-money-making (but stock-market-pleasing) buzzword areas like AI accelerators, Bitcoin chips, and non-gaming GPUs that I bet they're due for a very rude awakening, stuck with super-expensive dies they can't sell to the consumer sector once the current tech bubble deflates.

You can literally make money selling CPUs that are 3-4 generations old today because, well, there aren't enough consumer dies, and people do pay for things like laptops and PCs. Making cheaper CPUs on the latest processes should be the priority.
 