Investigating brain-inspired spiking neural networks 

Jayakumar Fellowship allows Andrew Duncombe ’21 to design computer architectures that make the implementation of complex computing processes, like neural networks, feasible for real-world applications. 

Over the past two decades, a wealth of technological advances has explored the full problem-solving capacity of machine learning, from detailed object classification to self-driving cars. The high-grade, dedicated hardware usually found in laboratory settings, such as large servers and supercomputers, enables very fast and effective deep learning. 

Andrew Duncombe explored energy-efficient hardware architecture for spiking neural networks (SNNs) over the summer.

Senior computer engineering concentrator Andrew Duncombe wants to redesign aspects of that learning to make it just as effective while using less power, so it is better suited to embedded devices. “There’s a lot of research looking into this problem,” he said, “and one of the methods that has come up, maybe in the last ten years, is actually inspired by how brains work, which are, of course, the most efficient computers we know about. 

“Except for maybe some pure math people, most people’s brains don’t run on linear algebra. They run using spikes and neural synapses,” Duncombe said. “Depending on the intensity and frequency of those incoming spikes, the neuron itself will fire and spike out and send its information to other neurons.” It is on this premise that spiking neural networks (SNNs), a general category of artificial neural networks that aim to emulate biology at the network level, are being investigated. SNNs use spikes for communication between neurons, the interconnected computing elements that make up the network. “This information is not encoded as integers, but spikes transmitted between neurons. This greatly reduces the complexity of the computations we need to perform,” Duncombe explained. 
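
A short software sketch can make that mechanism concrete. The leaky integrate-and-fire neuron below is a generic, textbook-style illustration of the behavior Duncombe describes, not his hardware design; the threshold, decay, and weight values are arbitrary choices for the example.

    # A generic leaky integrate-and-fire (LIF) neuron, the textbook building
    # block of spiking neural networks. Illustrative values only.
    class LIFNeuron:
        def __init__(self, threshold=1.0, decay=0.9):
            self.threshold = threshold   # membrane potential at which the neuron fires
            self.decay = decay           # leak applied to the potential each time step
            self.potential = 0.0         # internal state: accumulated membrane potential

        def step(self, weighted_input):
            # Integrate one time step of weighted incoming spikes;
            # return 1 if the neuron fires and 0 otherwise.
            self.potential = self.potential * self.decay + weighted_input
            if self.potential >= self.threshold:
                self.potential = 0.0     # reset after firing
                return 1
            return 0

    # Drive the neuron with a binary spike train scaled by a synaptic weight.
    neuron = LIFNeuron()
    incoming = [1, 0, 1, 1, 0, 1, 1, 1]   # spikes arriving from an upstream neuron
    weight = 0.4                          # synaptic strength
    outgoing = [neuron.step(spike * weight) for spike in incoming]
    print(outgoing)                       # [0, 0, 0, 1, 0, 0, 0, 1]

Frequent, strong input spikes push the potential over the threshold and produce output spikes; sparse input leaks away and produces none, which is the encoding Duncombe refers to.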

As a recipient of the Sriram Jayakumar ’13 Summer Research Fellowship, Duncombe was able to enter this space over the summer, bolstered by his work in Professor Sherief Reda’s Scalable Energy-Efficient Computing Laboratory. He was able to implement SNNs at the low level, or chip level, taking existing implementations and paring them down. Furthermore, he found he could make improvements to the system, particularly by using approximate circuitry. 

“Approximate circuitry produces inexact results, but with significant reductions in energy consumption. This is especially valuable for spiking neural networks,” he said. “Since there’s a little bit of error because the output is so binary – it’s either a spike or not a spike – you have a good amount of freedom with how you perform internal computations, as long as the final output either spikes or doesn’t spike.” 
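
As a rough software analogy for that freedom (not the actual circuits, which operate at the gate level), the sketch below rounds the neuron’s internal state to a coarse fixed-point grid as a stand-in for reduced-precision arithmetic, then compares the resulting spike train with the exact computation. The function names and values are illustrative assumptions, not the paper’s method.

    def quantize(x, step=0.125):
        # Round to a coarse fixed-point grid: a crude stand-in for the
        # reduced-precision arithmetic an approximate circuit might use.
        return round(x / step) * step

    def run(inputs, weight=0.4, threshold=1.0, decay=0.9, approx=False):
        potential, spikes = 0.0, []
        for s in inputs:
            potential = potential * decay + s * weight
            if approx:
                potential = quantize(potential)   # inexact internal state
            if potential >= threshold:
                spikes.append(1)
                potential = 0.0
            else:
                spikes.append(0)
        return spikes

    inputs = [1, 0, 1, 1, 0, 1, 1, 1]
    print(run(inputs, approx=False))   # exact accumulation:       [0, 0, 0, 1, 0, 0, 0, 1]
    print(run(inputs, approx=True))    # approximate accumulation: [0, 0, 0, 1, 0, 0, 0, 1]
    # In this example the internal error never flips a spike/no-spike decision.

Because only the threshold crossing matters, small internal errors like these often leave the output spike train unchanged, which is why approximate arithmetic pairs well with SNNs.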

Duncombe’s interest in this project began in Reda’s class ENGN 2911Q (Advanced Digital Design) in the fall of 2019, where he had some success reducing energy consumption in a course project. But it was the combination of studying abroad and COVID-19 that opened the door for the 2020 summer fellowship. Duncombe studied in Ireland last spring, and had already accepted a summer industry position before the pandemic canceled that endeavor. “I was fortunate to be nominated for, and awarded the Jayakumar fellowship to be able to work on this project remotely from my home in Wisconsin,” he said. 

“Eventually, I want to build an actual open source system that other researchers can work on and develop. That’s still in the future, but a lot of the early academic results, the high level results, we submitted to the upcoming Design Automation Conference. I’ll be first author on this paper, primarily showing the work I’ve done because of the Jayakumar award,” said Duncombe, who also plans to turn the research into an honors thesis for 2021. 

The fellowship is given in memory of Jayakumar to an undergraduate to help fund summer research and materials, and is especially meaningful to Reda, who worked with Jayakumar from his freshman through senior years. 

An honors thesis would cap a collegiate resume for Duncombe that was carefully constructed not only to provide him with engineering experience and knowledge, but also to allow him to take advantage of other university experiences. He was a member of Brown Space Engineering when EQUISat, Brown’s small satellite deployed from the International Space Station, was launched in May of 2018. Duncombe worked among a team of first-years to develop a physical ground station at the Ladd Observatory to receive transmissions. He also spent time as a member of Brown’s car team, and has been known to dabble in the Brown Design Workshop, patterning his work after that of artist Martin Tomsky, who uses laser cutting, mills and lathes to create interesting, complex artscapes. “I really like him so I’ve tried to recreate some of his work. I’m not super artistic, but I’m really good at copying and manufacturing things,” Duncombe said. 

“If you do engineering at a lot of other places, that’s the only thing you do and it’s very difficult to take classes outside of engineering. But at Brown, even in terms of engineering education, you are able to study abroad and do other things: take art classes and history classes. It is certainly a rare thing that Brown provides that was important to me.”