
Nvidia is starting to lose share to AI chip startups?

XYang2023

Active member

I tend to agree with that statement. In our lab, only one person uses CUDA, and only when it is necessary (not often).

I use PyTorch, and Llama 3.1 via the API.
 
I am not sure that Nvidia is losing share to AI startups, but inference is indeed much easier to migrate away from Nvidia and CUDA. On the other hand, large-scale training is still Nvidia's stronghold.
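One reason inference migrates easily is that PyTorch picks its compute backend at runtime, so the same script runs on an NVIDIA GPU, Apple silicon, or a plain CPU without code changes. A minimal sketch (the tiny `torch.nn.Linear` stands in for a real model, which would move between backends the same way):

```python
import torch

def pick_device() -> torch.device:
    """Prefer CUDA, then Apple MPS, falling back to CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
# Stand-in model; a full checkpoint would be moved with the same .to(device).
model = torch.nn.Linear(16, 4).to(device)
x = torch.randn(2, 16, device=device)
with torch.no_grad():
    y = model(x)
print(y.shape)  # torch.Size([2, 4]) on any backend
```

Nothing in the script names CUDA except the availability check, which is why swapping in a non-Nvidia backend usually means changing one line, not the model code.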
 
CUDA is used for more than just AI inference. The available software base is immense and hard to compete against.

Long term, I expect the AI neural network market to evolve towards dedicated NPUs.
 