Solid State Technology and Devices Seminar: Conventional silicon transistor- and spintronic-based implementation of neural network algorithms in analog hardware

Seminar: Solid State Technology and Devices: EE: CS | July 19 | 1-2 p.m. | 250 Sutardja Dai Hall

 Debanjan Bhowmik, Professor, Department of Electrical Engineering, Indian Institute of Technology Delhi

 Electrical Engineering and Computer Sciences (EECS)

Neural network algorithms, used extensively for classification, recognition, and prediction tasks, can be executed at higher speed and with lower energy consumption if they are implemented on a specialized analog crossbar architecture, which provides the advantages of parallel computation and memory-embedded computing. In this talk, I shall discuss my ongoing research on analog hardware neural networks, using both spintronic devices [1,2] and conventional silicon transistors [3] as synapses. While spintronic synapses can yield extremely low-energy neural networks for both on-chip and off-chip learning [1,2,4,5], conventional transistor-based synapses make analog neural networks much easier to fabricate and are mainly meant for on-chip learning [3]. I shall discuss the implementation, in our proposed hardware systems, of both non-spiking algorithms, which are widely used by the machine learning community, and spiking algorithms, which are based on biological data about neurons and synapses in the brain. In addition, I shall discuss some Quantum Neural Network algorithms that our group is working on to carry out the same kind of data classification with fewer weight parameters and hence less hardware.
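As background to the crossbar advantage mentioned above, here is a minimal sketch of the underlying idea: weights stored as device conductances perform a matrix-vector multiplication in one parallel analog step via Ohm's and Kirchhoff's laws. The function name and the conductance/voltage values are illustrative assumptions, not taken from the speaker's work, and ideal (linear, non-parasitic) devices are assumed.

```python
# Idealized analog crossbar: each synaptic weight is stored as a device
# conductance G[i][j]. Applying input voltages V[j] to the rows makes each
# column wire sum currents I[i] = sum_j G[i][j] * V[j] (Ohm's law per device,
# Kirchhoff's current law per column), so the whole matrix-vector product
# happens in a single parallel step, with memory and compute co-located.

def crossbar_mvm(conductances, voltages):
    """Ideal crossbar readout: output currents = conductance matrix x voltages."""
    return [sum(g * v for g, v in zip(row, voltages)) for row in conductances]

# Illustrative 2x3 crossbar (conductances in siemens, voltages in volts)
G = [[1e-6, 2e-6, 0.5e-6],
     [0.0,  1e-6, 1e-6]]
V = [0.1, 0.2, 0.3]
I = crossbar_mvm(G, V)  # two output currents, computed "in memory"
```

In a real array, nonidealities such as wire resistance, device nonlinearity, and limited conductance precision perturb this ideal picture; handling them is part of what makes analog hardware neural network design challenging.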

 dadevera@berkeley.edu, 510-642-3214