Nvidia and AI

Arthur Hanson

Well-known member
It looks like the data center business may be making a critical turn in the direction of AI. This could cause a major change in semis and the cloud industry by opening a large gap between high-powered data centers and everything else. It isn't hard to imagine thousands of these cores tied together and what future they may hold. Any thoughts and comments on where this could lead, and what timeline to expect, would be appreciated.

Nvidia’s Graphics Chips for AI, Not Just Gaming - Bloomberg Business
 
GP-GPUs are good for a lot of things; I'd like to try them for fast analog-like simulation, but it's not easy to get code onto them. It's particularly difficult to get old code to run on them efficiently, so stuff usually needs to be rewritten by specialist programmers - unless you know how to fix the compiler chain and runtime to do it transparently ;-) A rough sketch of what that kind of rewrite looks like is below.
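
To illustrate (this example is not from the thread): the typical rewrite turns a serial per-element loop into a GPU kernel where each element maps to a thread. The RC-node update, names, and parameter values here are hypothetical, just to show the sort of restructuring involved, assuming a CUDA toolchain.

#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical example: one explicit-Euler step for n independent RC nodes,
// dv/dt = (v_in - v) / (R*C). On a CPU this would be a plain for-loop;
// on the GPU each node is updated by one thread.
__global__ void rc_step(float *v, const float *v_in, float r, float c,
                        float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        v[i] += dt * (v_in[i] - v[i]) / (r * c);
}

int main()
{
    const int n = 1 << 20;              // 1M nodes (arbitrary size)
    float *v, *v_in;
    cudaMallocManaged(&v, n * sizeof(float));
    cudaMallocManaged(&v_in, n * sizeof(float));
    for (int i = 0; i < n; ++i) { v[i] = 0.0f; v_in[i] = 1.0f; }

    // Launch enough 256-thread blocks to cover all nodes.
    int threads = 256, blocks = (n + threads - 1) / threads;
    for (int step = 0; step < 1000; ++step)
        rc_step<<<blocks, threads>>>(v, v_in, 1e3f, 1e-6f, 1e-5f, n);
    cudaDeviceSynchronize();

    printf("v[0] after 1000 steps: %f\n", v[0]);
    cudaFree(v);
    cudaFree(v_in);
    return 0;
}

The kernel itself is the easy part; the hard part the post is pointing at is that legacy serial code rarely decomposes this cleanly without a rewrite.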

http://parallel.cc
 