Title: A sparse code increases the speed and efficiency of neuro-dynamic programming for optimal control tasks with correlated inputs
Contributor(s): Loxley, Peter N (author)
Publication Date: 2021-02-22
Early Online Version: 2020-11-04
DOI: 10.1016/j.neucom.2020.10.069
Handle Link: https://hdl.handle.net/1959.11/30943
Abstract: Sparse codes in neuroscience have been suggested to offer certain computational advantages over other neural representations of sensory data. To explore this viewpoint, a sparse code is used to represent natural images in an optimal control task solved with neuro-dynamic programming, and its computational properties are investigated. The central finding is that when feature inputs to a linear network are correlated, an over-complete sparse code efficiently increases the memory capacity of the network beyond that possible for any complete code with the same-sized input, and also increases the speed of learning the network weights. A complete sparse code is found to maximize the memory capacity of a linear network by decorrelating its feature inputs, transforming the design matrix of the least-squares problem to one of full rank. It also conditions the Hessian matrix of the least-squares problem, thereby increasing the rate of convergence to the optimal network weights. Other types of decorrelating codes would also achieve this. However, an over-complete sparse code is found to be approximately decorrelated, extracting a larger number of approximately decorrelated features from the same-sized input, allowing it to efficiently increase memory capacity beyond that possible for any complete code: a 2.25 times over-complete sparse code is shown to at least double memory capacity compared with a complete sparse code using the same input. This is used in sequential learning to store a potentially large number of optimal control tasks in the network, while catastrophic forgetting is avoided using a partitioned representation, yielding a cost-to-go function approximator that generalizes over the states in each partition. Sparse code advantages over dense codes and local codes are also discussed.
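The mechanism highlighted in the abstract, that decorrelating feature inputs improves the conditioning of the least-squares Hessian and so speeds convergence to the optimal weights, can be illustrated numerically. The sketch below is not the paper's sparse-coding pipeline: it uses ZCA whitening as a stand-in decorrelating transform on synthetic correlated features, and all variable names and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic correlated features (a stand-in for correlated natural-image inputs).
n, d = 500, 20
C = rng.standard_normal((d, d))
X = rng.standard_normal((n, d)) @ C              # correlated design matrix
w_true = rng.standard_normal(d)
y = X @ w_true + 0.01 * rng.standard_normal(n)   # targets with small noise

# ZCA whitening: one simple decorrelating transform (the paper instead
# obtains decorrelation from a sparse code).
cov = X.T @ X / n
eigval, eigvec = np.linalg.eigh(cov)
W = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T
Xw = X @ W                                       # decorrelated features

def gd_error(A, y, steps=200):
    """Gradient descent on 0.5*||A w - y||^2 / n; return final relative residual."""
    H = A.T @ A / n                              # Hessian of the least-squares loss
    lr = 1.0 / np.linalg.eigvalsh(H).max()       # stable step size 1/L
    w = np.zeros(A.shape[1])
    for _ in range(steps):
        w -= lr * (A.T @ (A @ w - y) / n)
    return np.linalg.norm(A @ w - y) / np.linalg.norm(y)

# Whitening drives the Hessian's condition number to ~1, so gradient
# descent reaches the optimal weights in far fewer iterations.
print("condition number, raw      :", np.linalg.cond(X.T @ X / n))
print("condition number, whitened :", np.linalg.cond(Xw.T @ Xw / n))
print("relative error, raw        :", gd_error(X, y))
print("relative error, whitened   :", gd_error(Xw, y))
```

This sketch only demonstrates the conditioning and convergence-rate effect; in the paper the decorrelation comes from the sparse code itself, and an over-complete sparse code additionally extracts more approximately decorrelated features from the same-sized input.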
Publication Type: Journal Article
Source of Publication: Neurocomputing, v.426, p. 1-13
Publisher: Elsevier BV
Place of Publication: Netherlands
ISSN: 0925-2312 (print); 1872-8286 (online)
Fields of Research (FoR) 2020: 460209 Planning and decision making
461105 Reinforcement learning
461301 Coding, information theory and compression
Socio-Economic Objective (SEO) 2020: 280115 Expanding knowledge in the information and computing sciences
280102 Expanding knowledge in the biological sciences
Peer Reviewed: Yes
HERDC Category Description: C1 Refereed Article in a Scholarly Journal
Appears in Collections: Journal Article
School of Science and Technology
