Please use this identifier to cite or link to this item: https://hdl.handle.net/1959.11/30943
Full metadata record
DC Field | Value | Language
dc.contributor.author | Loxley, Peter N | en
dc.date.accessioned | 2021-07-06T01:43:29Z | -
dc.date.available | 2021-07-06T01:43:29Z | -
dc.date.issued | 2021-02-22 | -
dc.identifier.citation | Neurocomputing, v.426, p. 1-13 | en
dc.identifier.issn | 1872-8286 | en
dc.identifier.issn | 0925-2312 | en
dc.identifier.uri | https://hdl.handle.net/1959.11/30943 | -
dc.description.abstract | Sparse codes in neuroscience have been suggested to offer certain computational advantages over other neural representations of sensory data. To explore this viewpoint, a sparse code is used to represent natural images in an optimal control task solved with neurodynamic programming, and its computational properties are investigated. The central finding is that when feature inputs to a linear network are correlated, an over-complete sparse code increases the memory capacity of the network in an efficient manner beyond that possible for any complete code with the same-sized input, and also increases the speed of learning the network weights. A complete sparse code is found to maximise the memory capacity of a linear network by decorrelating its feature inputs to transform the design matrix of the least-squares problem to one of full rank. It also conditions the Hessian matrix of the least-squares problem, thereby increasing the rate of convergence to the optimal network weights. Other types of decorrelating codes would also achieve this. However, an over-complete sparse code is found to be approximately decorrelated, extracting a larger number of approximately decorrelated features from the same-sized input, allowing it to efficiently increase memory capacity beyond that possible for any complete code: a 2.25 times over-complete sparse code is shown to at least double memory capacity compared with a complete sparse code using the same input. This is used in sequential learning to store a potentially large number of optimal control tasks in the network, while catastrophic forgetting is avoided using a partitioned representation, yielding a cost-to-go function approximator that generalizes over the states in each partition. Sparse code advantages over dense codes and local codes are also discussed. | en
dc.language | en | en
dc.publisher | Elsevier BV | en
dc.relation.ispartof | Neurocomputing | en
dc.title | A sparse code increases the speed and efficiency of neuro-dynamic programming for optimal control tasks with correlated inputs | en
dc.type | Journal Article | en
dc.identifier.doi | 10.1016/j.neucom.2020.10.069 | en
local.contributor.firstname | Peter N | en
local.profile.school | School of Science and Technology | en
local.profile.email | ploxley@une.edu.au | en
local.output.category | C1 | en
local.record.place | au | en
local.record.institution | University of New England | en
local.publisher.place | Netherlands | en
local.format.startpage | 1 | en
local.format.endpage | 13 | en
local.identifier.scopusid | 85096400116 | en
local.peerreviewed | Yes | en
local.identifier.volume | 426 | en
local.contributor.lastname | Loxley | en
dc.identifier.staff | une-id:ploxley | en
local.profile.orcid | 0000-0003-3659-734X | en
local.profile.role | author | en
local.identifier.unepublicationid | une:1959.11/30943 | en
local.date.onlineversion | 2020-11-04 | -
dc.identifier.academiclevel | Academic | en
local.title.maintitle | A sparse code increases the speed and efficiency of neuro-dynamic programming for optimal control tasks with correlated inputs | en
local.output.categorydescription | C1 Refereed Article in a Scholarly Journal | en
local.search.author | Loxley, Peter N | en
local.uneassociation | Yes | en
local.atsiresearch | No | en
local.sensitive.cultural | No | en
local.identifier.wosid | 000606709400001 | en
local.year.available | 2020 | en
local.year.published | 2021 | en
local.fileurl.closedpublished | https://rune.une.edu.au/web/retrieve/01c18ee5-0e4c-45d7-9e6b-e06bf9dad459 | en
local.subject.for2020 | 460209 Planning and decision making | en
local.subject.for2020 | 461105 Reinforcement learning | en
local.subject.for2020 | 461301 Coding, information theory and compression | en
local.subject.seo2020 | 280115 Expanding knowledge in the information and computing sciences | en
local.subject.seo2020 | 280102 Expanding knowledge in the biological sciences | en
Appears in Collections: Journal Article; School of Science and Technology
Files in This Item: 1 file
SCOPUS™ Citations: 1 (checked on Aug 17, 2024)
Page view(s): 1,020 (checked on Apr 2, 2023)
Download(s): 2 (checked on Apr 2, 2023)
Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.