
Google in talks with Marvell to build new AI chips, The Information reports

Daniel Nenni

Founder
Staff member


April 19 (Reuters) - Alphabet's Google (GOOGL.O) is in talks with Marvell Technology (MRVL.O) to develop two new chips aimed at running AI models more efficiently, The Information reported on Sunday, citing two people with knowledge of the discussions.

One of the chips is a memory processing unit designed to work with Google's tensor processing unit (TPU), and the other is a new TPU built specifically for running AI models, the report said.

Google has been pushing to make its TPUs a viable alternative to Nvidia's dominant GPUs. TPU sales have become a key driver of growth in Google's cloud revenue as it aims to show investors that its AI investments are generating returns.

Reuters could not immediately verify the report. Google and Marvell did not immediately respond to requests for comment.

The companies aim to finalize the design of the memory processing unit as soon as next year before handing it off for test production, according to the report.

 
Google has been using ASIC providers since day one.

Two questions: Why don't they just do it themselves? And why would Google switch providers after working with AVGO for so many years?

IP was a big reason why companies chose ASIC providers. Avago used to have the best SerDes, so Google worked with them for many years. Price is also a big reason for changing ASIC providers, but the difference has to be significant, since trust is also a big factor.

Maybe Google is looking for supply chain resilience? Google could also be using Marvell as pricing leverage against Broadcom? The ASIC business is a difficult one.
 
Thousands of CEOs admit AI had no impact on employment or productivity—and it has economists resurrecting a paradox from 40 years ago

In 1987, economist and Nobel laureate Robert Solow made a stark observation about the stalling evolution of the Information Age: Following the advent of transistors, microprocessors, integrated circuits, and memory chips in the 1960s, economists and companies expected these new technologies to disrupt workplaces and produce a surge of productivity. Instead, productivity growth slowed, dropping from 2.9% over 1948-1973 to 1.1% after 1973....

https://fortune.com/article/why-do-...-having-impact-productivity-employment-study/
 
We found that industries in states that were more exposed to AI experienced faster productivity growth beginning in 2021 – before ChatGPT reached the public – driven by enterprise tools already embedded in professional workflows, including GitHub Copilot for software development, Jasper for marketing and content writing, and Microsoft’s GPT-3-powered business applications. In 2024, for example, industries whose AI exposure was one standard deviation higher saw 10% higher productivity, 3.9% more jobs, and 4.8% higher wages than comparable industries in the same state.

https://theconversation.com/industr...ity-gains-but-jobs-and-wage-growth-too-224487
 
I think most experts now say that both the minicomputer/PC and internet-driven productivity rises happened after the initial commercial booms, for various reasons.

Perplexity answer below about computer / PC and Solow Productivity Paradox:

The productivity impact of the PC is one of the most studied and debated questions in economics. Here’s the full arc:
The Paradox Phase (1970s–early 1990s)
The PC era began with a striking puzzle. As computing capacity in the U.S. increased a hundredfold during the 1970s and 1980s, labor productivity growth actually slowed — from over 3% annually in the 1960s down to roughly 1% in the 1980s. This prompted Robert Solow’s famous 1987 quip: “You can see the computer age everywhere but in the productivity statistics” — which economists named the Solow Productivity Paradox.[wikipedia +1]
Why the Gains Were Hard to See
Several explanations emerged:
• Measurement failure — Much computer investment flowed into services (finance, insurance, legal) where output is notoriously hard to measure, so gains were real but invisible in national statistics[stlouisfed]
• Learning lags — Studies showed it took 2–5 years for IT investments to meaningfully impact organizations, creating a “productivity J-curve”[wikipedia]
• Capital substitution — Most firms were substituting cheap computers for expensive labor and older capital, which shows up as capital deepening rather than total factor productivity (TFP) growth
• Small share of GDP — Computers were a modest fraction of total investment, so even enormous growth in computer capital translated to small aggregate output effects
The Payoff Arrives (mid-1990s onward)
The paradox largely resolved in the 1990s. In computer-intensive manufacturing sectors, labor productivity growth jumped to 5.7% per year from 1990–1996, versus just 2.6% in non-computer-using manufacturing sectors. The computer-producing sector alone — less than 3% of private GDP — was responsible for one-third of total U.S. TFP growth in the 1980s, as Moore’s Law drove costs down 17%+ per year.[issues]
NBER research also found that PC adoption had modest positive effects on wages and employment overall, though it restructured jobs significantly — raising skill requirements and shifting demand toward college-educated workers.[nber]
The Bottom Line
The PC’s productivity impact was large but delayed by roughly 15–20 years, requiring organizational restructuring, workforce upskilling, and sufficient diffusion before aggregate statistics reflected the gains. The electricity dynamo analogy is apt — electrification similarly took decades to show up in productivity data, as firms had to redesign factories around the new technology. This historical pattern is now the primary framework economists use when debating whether generative AI will follow the same trajectory.
 
Maybe Google is looking for supply chain resilience? Google could also be using Marvell as pricing leverage against Broadcom? The ASIC business is a difficult one.
I would agree on all three reasons:
* develop and leverage SerDes knowledge more broadly, which
* enables pricing leverage
* and builds a broader, more resilient supply chain.

We’re in a world where scale-up, scale-out, and scale-everywhere point-to-point connectivity seems to be as important as the core AI processors, memory, and storage hierarchy. And they all have to work together cohesively under the model/software and application stack for extreme co-optimization.
 
Google has been using ASIC providers since day one.

Two questions: Why don't they just do it themselves? And why would Google switch providers after working with AVGO for so many years?

IP was a big reason why companies chose ASIC providers. Avago used to have the best SerDes, so Google worked with them for many years. Price is also a big reason for changing ASIC providers, but the difference has to be significant, since trust is also a big factor.

Maybe Google is looking for supply chain resilience? Google could also be using Marvell as pricing leverage against Broadcom? The ASIC business is a difficult one.

Along with its internal ASIC development team, Google now works with at least four external ASIC partners: Broadcom, MediaTek, Intel, and a new addition, Marvell. Given Google’s scale, volume, and diverse needs, that’s a reasonable approach.
 