Smart as a Bug: A Computational Model of Learning in the Moth Olfactory Network, with Applications to Neural Nets

by Charles Boise Delahunt

Institution: University of Washington
Department:
Year: 2018
Keywords: computational modeling; machine learning; moth olfactory network; neural injury; reinforcement learning; sparsity; Neurosciences; Computer science; Electrical engineering
Posted: 02/01/2018
Record ID: 2215971
Full text PDF: http://hdl.handle.net/1773/40881


Abstract

The moth olfactory network is one of the simplest biological neural systems capable of learning, and it is thus ideal for exploring how learning occurs. The network, which includes the antennal lobe, mushroom body, and ancillary structures, contains several key structural motifs that are widespread in biological neural systems and of great interest. These include cascading networks, large dimension shifts from stage to stage, high-dimensional sparse codings of data, randomness, Hebbian ("fire together, wire together") plasticity, and octopamine stimulation as a vital part of the learning mechanism. While these components are widespread in natural neural systems, they are largely absent from the engineered neural nets of machine learning. This thesis has three goals: to characterize the various components of the moth's olfactory system and how they enable it to learn; to port the moth's "bag of tricks" to machine learning contexts; and to examine learning as an injury mitigation mechanism.

Our approach is to build a full computational model of the moth olfactory system with the following properties: its structure and mechanics are tightly tethered to current knowledge of the moth olfactory system; its behavior statistically matches experimental data; and it is able to robustly learn new odors. To our knowledge, this is the first full neural network model that is tightly tied in structure and behavior to a real biological system and that also demonstrates learning behavior. From a biology perspective, the model is a valuable platform for examining how key structural features enable learning in nature. For example, we analyze the role of octopamine stimulation and the functions of high-dimensional sparse network stages in learning. The model also allows predictions about structural details of the olfactory system that are not currently well characterized. In addition, we explore the role of learning, and of other structural features, as injury mitigation mechanisms.

From a machine learning perspective, the model allows us to identify promising structures and tools that can be ported to ML systems. For example, it offers bio-mimetic solutions to two open concerns in human-built neural nets: it uses a biologically plausible optimization method to train the network, potentially bridging a long-standing gap between natural and human-built neural nets; and it requires few training samples, offering a potential means to address a current bottleneck in the use of neural nets, viz., their vast appetite for training data.

Advisors/Committee Members: Kutz, Jose N (advisor); Riskin, Eve (advisor).
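
To make two of the motifs named in the abstract concrete, the following is a minimal, hypothetical sketch, not the thesis's actual model or code: it combines a fixed random projection into a high-dimensional sparse code (a mushroom-body-like stage) with a Hebbian readout whose updates are gated by an octopamine-like reward signal. All layer sizes, the sparsity fraction, and the learning rate are illustrative assumptions.

```python
# Hypothetical illustration of reward-gated Hebbian learning on a sparse
# high-dimensional code; sizes and constants are assumptions, not thesis values.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_sparse, n_out = 60, 2000, 10     # assumed layer sizes (input -> sparse stage -> readout)
W_proj = rng.normal(size=(n_sparse, n_in))   # fixed random projection (not learned)
W_out = np.zeros((n_out, n_sparse))          # plastic readout weights
lr = 0.05                                    # assumed learning rate

def sparse_code(x, frac_active=0.05):
    """Project the input to high dimension and keep only the top ~5% of units."""
    h = W_proj @ x
    k = max(1, int(frac_active * h.size))
    thresh = np.partition(h, -k)[-k]
    return np.where(h >= thresh, h, 0.0)

def hebbian_update(x, target, octopamine=1.0):
    """'Fire together, wire together': strengthen weights between co-active
    pre- and post-synaptic units, scaled by an octopamine-like reward signal."""
    global W_out
    h = sparse_code(x)
    y = np.zeros(n_out)
    y[target] = 1.0                          # teaching signal for the rewarded class
    W_out += lr * octopamine * np.outer(y, h)    # Hebbian outer-product update

# Usage: a few noisy presentations of one "odor" suffice to bias the readout,
# loosely echoing the few-training-sample behavior described in the abstract.
odor = rng.normal(size=n_in)
for _ in range(3):
    hebbian_update(odor + 0.1 * rng.normal(size=n_in), target=2)
print(np.argmax(W_out @ sparse_code(odor)))  # -> 2
```

The sketch is only meant to show how a sparse expansion and a reward-modulated Hebbian rule fit together; the thesis's model is tied to moth anatomy and physiology in far more detail.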