Condensed Matter Physics Seminar


Introduction to Neuromorphic Computing: Parts I, II, and III (3 sessions)

In this set of three lectures, we explore ideas underlying modern research into brain-inspired, nanodevice-driven computational hardware.

  • In Part I, we introduce artificial neural networks, which underlie much of today’s machine learning and computer vision. We also explore how resistive crossbar architectures carry out neural network inference using Kirchhoff’s laws.
  • In Part II, we discuss recurrent neural networks and reservoir computing. We also discuss how time-division multiplexing has been used to implement these ideas in certain magnetic systems.
  • Finally, in Part III, we discuss stochastic computing and close the three-day tutorial with a survey of the current landscape of neuromorphic computing, including an outline of where and how novel materials and magnetic systems have key roles to play going forward.
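To give a flavor of the crossbar idea mentioned in Part I: in a resistive crossbar, input voltages drive the columns, each cross-point stores a programmable conductance, Ohm’s law sets the current through each device, and Kirchhoff’s current law sums those currents along each row. The result is a matrix-vector product, the core operation of neural network inference. A minimal illustrative sketch (the function name and values are hypothetical, not from the lectures):

```python
def crossbar_currents(G, V):
    """Row currents I_i = sum_j G_ij * V_j.

    G: list of rows of conductances (siemens), one row per output line.
    V: list of input voltages (volts), one per column.
    Ohm's law gives each device current G_ij * V_j; Kirchhoff's current
    law sums the device currents meeting at each row wire.
    """
    return [sum(g * v for g, v in zip(row, V)) for row in G]


# Example: a 2x3 crossbar (a 2x3 weight matrix) and three input voltages.
G = [[1.0, 0.5, 0.0],
     [0.2, 0.1, 1.0]]
V = [0.3, 0.2, 0.1]
I = crossbar_currents(G, V)  # approximately [0.4, 0.18]
```

The key point is that the multiply-accumulate happens in the analog physics of the array itself, in a single step, rather than in sequential digital arithmetic.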

Attendees are encouraged to bring pen and paper, as there may be some interactive problems to work through during the lectures.


  1. Part I:  Wednesday, July 17 — 1:00 pm, Wean Hall 7316
  2. Part II:  Thursday, July 18 — 1:00 pm, Wean Hall 7316
  3. Part III:  Friday, July 19 — 1:00 pm, Wean Hall 7316

Matthew Daniels is a CNST/UMD Postdoctoral Researcher in the Electron Physics Group. He received a B.S. in physics from Clemson University and a Ph.D. in physics from Carnegie Mellon University. For his doctoral research, he developed a semiclassical formalism for exposing a novel, spin-like degree of freedom in antiferromagnetic spin waves. The work enables researchers to use antiferromagnetic insulators to perform semiclassical quantum computations. Matthew is working with Mark Stiles on theoretical models for neuromorphic computing with spintronic devices, specifically on networks of spin-torque oscillators.

For More Information, Please Contact: 

Catherine Copetas,