
Shrinking Neural Networks & the "Great Acceleration"

Arthur Hanson

Well-known member
A new method of dramatically shrinking the size of neural networks has come out of MIT, and it may make it possible to put limited AI even on a smartphone. With the number of AI/ML researchers worldwide above 300K and growing every year, AI/ML is the greatest research project in human history. With the semi/nanotech and robotics sectors advancing in step with AI/ML, and with the drastic reductions 3D printing is bringing to robotics prototyping, we should soon be seeing drastic changes in just about everything.

The pace of advancement is itself picking up speed, and those who master the changes and learn to work with the "Great Acceleration" are going to be the winners. As with any major project, running multiple parallel paths instead of a single linear path will speed progress dramatically. The individuals who can master running parallel paths on projects will win, and they will probably assign AI/ML systems to develop very sophisticated project planning and probability estimates that speed progress even more. The skill of building, running, and coordinating teams in this area will become a key skill set, and it presents a new frontier in itself.

I have set up both linear and parallel paths on physical projects, have genuinely enjoyed the challenge, and will be experimenting with methods of applying the approach to finance. Any thoughts, experiences, or references on the application of structures of this type would be appreciated.
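The parallel-versus-linear point can be made concrete with a toy calculation: with enough workers, total project time collapses from the sum of all task durations to the length of the longest dependency chain (the critical path). Here is a minimal Python sketch; the task names, durations, and dependencies are invented for illustration, not taken from any real project.

```python
def critical_path_length(tasks):
    """tasks: {name: (duration, [dependencies])} -> earliest overall finish time,
    assuming every task starts as soon as its dependencies complete."""
    finish = {}  # memoized earliest finish time per task

    def earliest_finish(name):
        if name not in finish:
            duration, deps = tasks[name]
            # A task can start only after its slowest dependency finishes.
            start = max((earliest_finish(d) for d in deps), default=0)
            finish[name] = start + duration
        return finish[name]

    return max(earliest_finish(n) for n in tasks)

# Hypothetical project, durations in weeks.
project = {
    "design":    (2, []),
    "prototype": (4, ["design"]),
    "tooling":   (3, ["design"]),   # runs in parallel with prototyping
    "test":      (2, ["prototype", "tooling"]),
}

linear = sum(d for d, _ in project.values())  # one task at a time: 11 weeks
parallel = critical_path_length(project)      # parallel paths: 8 weeks
print(linear, parallel)
```

Even in this four-task example the parallel schedule saves three weeks, and the gap widens as more independent paths exist in the dependency graph.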


MIT CSAIL details technique for shrinking neural networks without compromising accuracy | VentureBeat

A new way to build tiny neural networks could create powerful AI on your phone - MIT Technology Review
 