Aren't AMD's resources being stretched too thin?

AMD is absolutely resource limited - they’re still a much smaller company than Intel or Nvidia, despite years of growth. However, they’re definitely pivoting their investments to where it makes the most sense. Back in the 2013-2016 era, they chose to starve the GPU division to focus on CPUs, and that’s what resulted in Zen 1 coming out and being greatly successful. For reference, it takes 4-6 years to get a new CPU or GPU from idea to production.

That “Radeon division diet” clearly persisted: AMD Polaris (RX 400, RX 500 series) had to compete against multiple generations of Nvidia parts and essentially stayed entry level (or low midrange). The RX 5700 XT was also limited to midrange despite being a true next-gen architecture from them. That they’re in the game at all is a testament to their engineers - I don’t have the numbers, but I believe they’re investing something like 10% of what Nvidia is in GPU engineering, so the fact that they have something competing in the $700-1000 range at all is impressive.

I do worry they’re a little too focused on bean counting and not going for more growth overall - though you can see they’re taking the server and enterprise market by storm. AMD made the mistake of not second-sourcing capacity (they talked about it but did nothing) in the Athlon 64/X2 days (2003-2006), and they capped at something like 30% market share for desktop and mobile. When Core 2 came out, they had no options for reducing capacity that weren’t expensive, and they also had less mindshare than they might have had otherwise.

I think overall AMD is in the strongest position it’s ever been in its entire history - through a combination of smart investment, excellent engineering discipline, and some luck too. (Intel misstepping on nodes and products and COVID shortages providing an opportunity to “sell every GPU at a good price” helped, but they weren’t the sole reason for AMD’s rise.)


"AMD is absolutely resource limited - they’re still a much smaller company than Intel or Nvidia, despite years of growth. "

Are you measuring these three companies by their revenue, market cap, or headcount?
 
"AMD is absolutely resource limited - they’re still a much smaller company than Intel or Nvidia, despite years of growth. "

Are you measuring these three companies by their revenue, market cap, or headcount?
Headcount. (Thinking of Engineering pool)

Intel 110-130K
AMD 25K
Nvidia 26K

OK, Nvidia is smaller than I thought - but AMD is trying to compete with both Nvidia and Intel at the same time... only a fraction are working on GPUs, and the majority on CPUs, if I understand where their investments are going.
 
I think AMD getting rid of their fabs was a key pivot. Had they not done that, I don't believe they would be in business, much less an industry leader. The same can be said about AMD's partnership with TSMC. That was a HUGE pivot and is the foundation of their current success. I also believe the acquisition of Xilinx was an important part of AMD's success with the TSMC partnership. Xilinx had one of the best foundry groups and a very close relationship with TSMC. Xilinx and TSMC co-developed CoWoS, right?

Raw revenue numbers don't mean a lot without knowing inventory levels.

From what I can see myself, AMD's Zen 4 mobile sales were significantly delayed to sell off remaining Zen 3 based laptop chips in a more orderly way than a flash sale.

And this year we saw almost no brand-new laptop models at Computex, only respins and refreshes.

It's a good time to get last year's laptop on sale. Laptop makers will not be significantly updating their lineups for the next year or two.
 
Nvidia is a more serious threat to AMD in GPUs than Intel is to them in CPUs. I agree with that assessment. But saying they are at risk of going out of business in client is hyperbole.
 
Sorry TICin, I will withhold that. It is possible that some of the professors are doing a good job of teaching the concepts, but the students are only interested in knowing what is on the test. After 20 years of bringing on MSEEs, I still haven't completely figured that out. I will tell you that the newbies are getting worse every year. It makes no sense. Maybe there are too many funny videos out there to distract the yuts. Dunno.
My bud is an adjunct prof at a uni out here. He relates similar rates of non-readiness in his incoming students, with an inverse correlation to grade grubbing. The unis want the gold and don't care how the students do. Just one of many failures in the pipeline.

Every firm is in danger of losing their best to the competition. Then they must find replacements. Different filters for different firms. The experiment will continue....
 
My bud is an adjunct prof at a uni out here. He relates similar rates of non-readiness in his incoming students, with an inverse correlation to grade grubbing. The unis want the gold and don't care how the students do. Just one of many failures in the pipeline.

Every firm is in danger of losing their best to the competition. Then they must find replacements. Different filters for different firms. The experiment will continue....
My take is different - you won’t find the best and the brightest in most electrical engineering programs. The best in STEM are all vectoring to computer science. I have two kids who graduated from the EECS program at Berkeley. They did the mandatory EE classes, but no more. One did one EE-related internship. Both specialized in the computer science part of the program for their last couple years. Both kids had friends who were off-the-chart brilliant - all of them went the CS route because of market demand and interest in machine learning and AI. Colleges are graduating some really talented kids, but they mostly aren’t the ones doing EE.
 
My take is different - you won’t find the best and the brightest in most electrical engineering programs. The best in STEM are all vectoring to computer science. I have two kids who graduated from the EECS program at Berkeley. They did the mandatory EE classes, but no more. One did one EE-related internship. Both specialized in the computer science part of the program for their last couple years. Both kids had friends who were off-the-chart brilliant - all of them went the CS route because of market demand and interest in machine learning and AI. Colleges are graduating some really talented kids, but they mostly aren’t the ones doing EE.
Agree with this take.

Anyone smart who cares about getting a high salary is going into CS, not EE, and absolutely not into manufacturing. Manufacturing companies are very out of touch. Tech companies pay senior engineers $300-400k plus tons of perks. The manufacturing industry pays senior engineers around $120k, with unpaid OT and a tough work environment, and wonders why it can't attract the smartest people.
 
Interesting. Probably true. We only employ new MSEEs. We stick to experienced C++ developers.
Man, the world has changed. C++ was the hot new thing when I started working. The software world for both of my kids today is Python, microservices stacks (including Airflow, Arrow, Spark), CUDA and various huge data storage systems / SQLs.
 
Agree with this take.

Anyone smart who cares about getting a high salary is going into CS, not EE, and absolutely not into manufacturing. Manufacturing companies are very out of touch. Tech companies pay senior engineers $300-400k plus tons of perks. The manufacturing industry pays senior engineers around $120k, with unpaid OT and a tough work environment, and wonders why it can't attract the smartest people.

When I was in college there were MANY more EE jobs than CS jobs. It is the complete opposite today, and CS jobs pay very well.
 
Man, the world has changed. C++ was the hot new thing when I started working. The software world for both of my kids today is Python, microservices stacks (including Airflow, Arrow, Spark), CUDA and various huge data storage systems / SQLs.
We are an EDA company, so the interaction in the editor has to be fast. Also, layouts have lots of little rectangles. The industry database (OpenAccess) uses C++. We could use Python for lots of scripts, but our developers just use C++ and bash scripts. They decide.

Most of us are still using manual transmissions.
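
To give a flavor of why C++ fits here, a rough sketch of my own - illustrative only, not OpenAccess code: a layout is basically millions of small rectangles stored contiguously, and a redraw is a tight loop over them with no per-shape allocation.

#include <cstdint>
#include <vector>

struct Rect {                      // plain 16-byte value type, cache friendly
    int32_t x1, y1, x2, y2;
};

// Count rectangles intersecting the viewport, e.g. during a redraw.
std::size_t countVisible(const std::vector<Rect>& shapes, const Rect& view) {
    std::size_t n = 0;
    for (const Rect& r : shapes) {
        if (r.x1 < view.x2 && r.x2 > view.x1 &&
            r.y1 < view.y2 && r.y2 > view.y1)
            ++n;                   // no heap traffic, easy for the compiler to optimize
    }
    return n;
}

A scripting language would spend most of its time on per-object overhead in a loop like that; in C++ the data stays flat and the loop stays fast.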
 
It's interesting - C++ is a much more complex language than Python or JS, or even Java. But C++ developers actually make less, since it's a less in-demand skill set.

These things tend to correct themselves in the longer term. I'd be bracing for a time when C++ developers start also demanding $300k+ salaries.
 
It's interesting - C++ is a much more complex language than Python or JS, or even Java. But C++ developers actually make less, since it's a less in-demand skill set.

These things tend to correct themselves in the longer term. I'd be bracing for a time when C++ developers start also demanding $300k+ salaries.
I've read the statistics on the internet about C++ software engineers making only $100K or so. For experienced systems programmers in the Bay Area I doubt you can hire anyone with a decent C++ product development background for less than $200K. Ace C++ developers, who often have titles like principal engineer, make double that and more. I had two C++ development teams in my group at my last job, and it's a rare skillset to use the language efficiently. Intel is making C++ a more valuable skill with its oneAPI strategy, which is based on an extended version of C++ called Data Parallel C++.

Personally, I dislike C++. It is too easy to use it improperly and compromise debug-ability and performance. Linus Torvalds dislikes it too; he called it "a terrible language".
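
For anyone who hasn't seen it, here is roughly what Data Parallel C++ looks like - a minimal vector-add sketch of my own, not something out of Intel's documentation. DPC++ is essentially standard C++ plus SYCL, so the kernel is just a lambda handed to a queue:

#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    const size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    sycl::queue q;                               // default device: CPU or GPU
    {
        sycl::buffer<float> A(a.data(), sycl::range<1>(n));
        sycl::buffer<float> B(b.data(), sycl::range<1>(n));
        sycl::buffer<float> C(c.data(), sycl::range<1>(n));

        q.submit([&](sycl::handler& h) {
            sycl::accessor ra(A, h, sycl::read_only);
            sycl::accessor rb(B, h, sycl::read_only);
            sycl::accessor wc(C, h, sycl::write_only);
            h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                wc[i] = ra[i] + rb[i];           // element-wise add runs on the device
            });
        });
    }                                            // buffers go out of scope, results copy back to c

    std::cout << c[0] << "\n";                   // prints 3
}

Same language, same toolchain habits, just aimed at accelerators - which is why it makes existing C++ skills more valuable rather than replacing them.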
 
I've read the statistics on the internet about C++ software engineers making only $100K or so. For experienced systems programmers in the Bay Area I doubt you can hire anyone with a decent C++ product development background for less than $200K. Ace C++ developers, who often have titles like principal engineer, make double that and more. I had two C++ development teams in my group at my last job, and it's a rare skillset to use the language efficiently. Intel is making C++ a more valuable skill with its oneAPI strategy, which is based on an extended version of C++ called Data Parallel C++.

Personally, I dislike C++. It is too easy to use it improperly and compromise debug-ability and performance. Linus Torvalds dislikes it too; he called it "a terrible language".
I wouldn't call it a terrible language. It's a terrible general-purpose programming language, for sure. But when you need to do things like manipulate memory or the call stack precisely for performance reasons, it's very useful. The problem is that it requires a VERY good understanding of what the computer is actually doing to use properly, and you have to be very careful about things like memory management, because C++ is extremely unforgiving.

C++ is a great language for embedded systems with small codebases that require a lot of performance and are not very complex. In embedded systems you are expected to have a good understanding of the underlying hardware as well.
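
To illustrate "unforgiving" - a made-up two-minute example of the kind of thing the compiler will happily accept:

#include <iostream>
#include <memory>

int* dangling() {
    int local = 42;
    return &local;              // bug: returns the address of a dead stack variable
}

int main() {
    int* p = dangling();        // compiles (with at most a warning)
    // Reading *p is undefined behavior: it might "work", crash, or print garbage.
    (void)p;

    // The discipline C++ expects of you: make ownership explicit.
    auto q = std::make_unique<int>(42);
    std::cout << *q << "\n";    // freed automatically when q goes out of scope
}

Nothing in the language stops the first version from shipping; in Python or Java the equivalent mistake mostly can't even be written.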
 
Good discussion! I get that CS will attract the best since it's paying better.

But it leaves me wondering, EE is still important, too. Will the pay pendulum swing back hard, or will EE salaries just 'catch up' with CS? Or will EE work just get completely automated? (I personally think not, on this last question.)
 
Good discussion! I get that CS will attract the best since it's paying better.

But it leaves me wondering, EE is still important, too. Will the pay pendulum swing back hard, or will EE salaries just 'catch up' with CS? Or will EE work just get completely automated? (I personally think not, on this last question.)
I haven't seen lower EE salaries at comparable companies - for example, at Intel and two of my other employers. If anything, analog EEs seem to get a premium.
 
AMD is absolutely resource limited - they’re still a much smaller company than Intel or Nvidia, despite years of growth. However, they’re definitely pivoting their investments to where it makes the most sense. Back in the 2013-2016 era, they chose to starve the GPU division to focus on CPUs, and that’s what resulted in Zen 1 coming out and being greatly successful. For reference, it takes 4-6 years to get a new CPU or GPU from idea to production.

That “Radeon division diet” clearly persisted: AMD Polaris (RX 400, RX 500 series) had to compete against multiple generations of Nvidia parts and essentially stayed entry level (or low midrange). The RX 5700 XT was also limited to midrange despite being a true next-gen architecture from them. That they’re in the game at all is a testament to their engineers - I don’t have the numbers, but I believe they’re investing something like 10% of what Nvidia is in GPU engineering, so the fact that they have something competing in the $700-1000 range at all is impressive.

I do worry they’re a little too focused on bean counting and not going for more growth overall - though you can see they’re taking the server and enterprise market by storm. AMD made the mistake of not second-sourcing capacity (they talked about it but did nothing) in the Athlon 64/X2 days (2003-2006), and they capped at something like 30% market share for desktop and mobile. When Core 2 came out, they had no options for reducing capacity that weren’t expensive, and they also had less mindshare than they might have had otherwise.

I think overall AMD is in the strongest position it’s ever been in its entire history - through a combination of smart investment, excellent engineering discipline, and some luck too. (Intel misstepping on nodes and products and COVID shortages providing an opportunity to “sell every GPU at a good price” helped, but they weren’t the sole reason for AMD’s rise.)
Exactly. I wonder if maybe they shouldn't borrow more money. They are definitely walking a very narrow line between trying to make their operations extremely efficient and, on the other hand, being starved. Radeon, which you mentioned, is a good example. They are definitely investing more - they have to be. But it's just not enough to catch up to Nvidia yet.
 