Biologically Inspired Task Abstraction and Generalization Models of Working Memory

dc.contributor.advisor Phillips, Joshua
dc.contributor.author Jovanovich, Michael P.
dc.contributor.committeemember Phillips, Joshua
dc.contributor.committeemember Li, Cen
dc.contributor.committeemember Barbosa, Sal
dc.contributor.department Computer Science en_US
dc.date.accessioned 2018-01-04T20:26:50Z
dc.date.available 2018-01-04T20:26:50Z
dc.date.issued 2017-10-27
dc.description.abstract We first present a model of working memory that affords generalization. Stimuli are separated so that filler representations flow through the model according to the state of gates, which open or close in response to role signals; an action-selection network can therefore learn responses to fillers that are independent of the roles in which those fillers were encountered. Next, we present n-task learning, an extension of temporal difference learning that allows multiple policies to form over a common set of sensory inputs. To let the same state inputs map to multiple values, each state is joined with an arbitrary input called an abstract task representation. Task performance is shown to converge to optimal on a dynamic categorization problem in which input features are identical across all tasks.
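The n-task learning idea described in the abstract can be illustrated with a short sketch: the sensory state is joined with an abstract task representation (ATR) before value lookup, so a single temporal difference learner maintains a separate policy for each task even when the sensory inputs are identical. The following is a minimal, hypothetical tabular Q-learning sketch based only on the abstract above; the class name, parameters, and tabular value representation are illustrative assumptions, not the thesis's actual implementation.

# Sketch only: tabular Q-learning in which each observation is joined with an
# abstract task representation, so distinct policies form over identical inputs.
import random
from collections import defaultdict

class AbstractTaskAgent:
    def __init__(self, n_actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)   # keyed by ((state, task), action)
        self.n_actions = n_actions
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def _key(self, state, task):
        # Joining the sensory state with the ATR yields the effective state.
        return (state, task)

    def act(self, state, task):
        # Epsilon-greedy action selection over the task-conditioned values.
        if random.random() < self.epsilon:
            return random.randrange(self.n_actions)
        key = self._key(state, task)
        return max(range(self.n_actions), key=lambda a: self.q[(key, a)])

    def update(self, state, task, action, reward, next_state, next_task):
        # Standard temporal difference (Q-learning) update on the joined state.
        key = self._key(state, task)
        next_key = self._key(next_state, next_task)
        best_next = max(self.q[(next_key, a)] for a in range(self.n_actions))
        td_target = reward + self.gamma * best_next
        self.q[(key, action)] += self.alpha * (td_target - self.q[(key, action)])

Because the task representation is part of the lookup key, the same sensory state can be assigned different values and actions under different tasks, which is the property the abstract's dynamic categorization experiment exercises.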
dc.description.degree M.S.
dc.identifier.uri http://jewlscholar.mtsu.edu/xmlui/handle/mtsu/5561
dc.publisher Middle Tennessee State University
dc.subject Computational neuroscience
dc.subject Machine learning
dc.subject Reinforcement learning
dc.subject Working memory
dc.subject.umi Computer science
dc.subject.umi Artificial intelligence
dc.subject.umi Neurosciences
dc.thesis.degreegrantor Middle Tennessee State University
dc.thesis.degreelevel Masters
dc.title Biologically Inspired Task Abstraction and Generalization Models of Working Memory
dc.type Thesis
Files
Original bundle
Name: Jovanovich_mtsu_0170N_10891.pdf
Size: 1.24 MB
Format: Adobe Portable Document Format