Survey of Spintronic Architectures for Processing-in-Memory and Neural Networks

sparsh

Member
Spintronic memories such as STT-RAM (spin-transfer torque RAM), SOT-RAM (spin-orbit torque RAM) and DWM (domain wall memory) facilitate efficient implementation of the PIM (processing-in-memory) approach and of NN (neural network) accelerators, and offer several advantages over conventional memories.


Our survey reviews 75+ papers on spintronic architectures for PIM and NNs. This paper will be useful for researchers in the areas of artificial intelligence, hardware architecture, chip design and memory systems.


Accepted in Journal of Systems Architecture 2018. Available here.
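For readers new to the topic: the core idea these architectures exploit is computing a matrix-vector product directly inside the memory array, instead of shuttling every weight out to a processor. Below is a minimal NumPy sketch of that idealized crossbar computation (all names and values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Idealized crossbar model (illustrative only): NN weights are stored as cell
# conductances G (siemens); inputs are applied as word-line voltages V (volts).
# Each bit line sums the currents of its column (Kirchhoff's current law), so
# the array produces I = G^T @ V in a single step -- a matrix-vector multiply
# performed "in memory".

rng = np.random.default_rng(0)

G = rng.uniform(0.0, 1e-6, size=(128, 64))  # 128 word lines x 64 bit lines
V = rng.uniform(0.0, 0.2, size=128)         # input voltages on word lines

I = G.T @ V                                 # bit-line currents = weighted sums

# A conventional digital design would instead move all 128*64 weights across
# the memory bus; the crossbar yields the same dot products without that traffic.
print(I.shape)  # (64,)
```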
 
Access via ResearchGate may be easier: (PDF) A Survey of Spintronic Architectures for Processing-in-Memory and Neural Networks
 
Thank you for sharing! This is certainly great work. To be honest, I haven't carefully read all 37 pages of the document... Do you think it could make sense to build a 2-page summary out of this great work?
Anyway, please feel free to share your work!
 
Thanks for your suggestion and encouragement. But I think summarizing it in 2 pages would be very difficult.
 