Please use this identifier to cite or link to this item:
https://hdl.handle.net/1959.11/30943
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Loxley, Peter N | en |
dc.date.accessioned | 2021-07-06T01:43:29Z | - |
dc.date.available | 2021-07-06T01:43:29Z | - |
dc.date.issued | 2021-02-22 | - |
dc.identifier.citation | Neurocomputing, v.426, p. 1-13 | en |
dc.identifier.issn | 1872-8286 | en |
dc.identifier.issn | 0925-2312 | en |
dc.identifier.uri | https://hdl.handle.net/1959.11/30943 | - |
dc.description.abstract | Sparse codes in neuroscience have been suggested to offer certain computational advantages over other neural representations of sensory data. To explore this viewpoint, a sparse code is used to represent natural images in an optimal control task solved with neurodynamic programming, and its computational properties are investigated. The central finding is that when feature inputs to a linear network are correlated, an over-complete sparse code increases the memory capacity of the network in an efficient manner beyond that possible for any complete code with the same-sized input, and also increases the speed of learning the network weights. A complete sparse code is found to maximise the memory capacity of a linear network by decorrelating its feature inputs to transform the design matrix of the least-squares problem to one of full rank. It also conditions the Hessian matrix of the least-squares problem, thereby increasing the rate of convergence to the optimal network weights. Other types of decorrelating codes would also achieve this. However, an over-complete sparse code is found to be approximately decorrelated, extracting a larger number of approximately decorrelated features from the same-sized input, allowing it to efficiently increase memory capacity beyond that possible for any complete code: a 2.25 times over-complete sparse code is shown to at least double memory capacity compared with a complete sparse code using the same input. This is used in sequential learning to store a potentially large number of optimal control tasks in the network, while catastrophic forgetting is avoided using a partitioned representation, yielding a cost-to-go function approximator that generalizes over the states in each partition. Sparse code advantages over dense codes and local codes are also discussed. | en |
dc.language | en | en |
dc.publisher | Elsevier BV | en |
dc.relation.ispartof | Neurocomputing | en |
dc.title | A sparse code increases the speed and efficiency of neuro-dynamic programming for optimal control tasks with correlated inputs | en |
dc.type | Journal Article | en |
dc.identifier.doi | 10.1016/j.neucom.2020.10.069 | en |
local.contributor.firstname | Peter N | en |
local.profile.school | School of Science and Technology | en |
local.profile.email | ploxley@une.edu.au | en |
local.output.category | C1 | en |
local.record.place | au | en |
local.record.institution | University of New England | en |
local.publisher.place | Netherlands | en |
local.format.startpage | 1 | en |
local.format.endpage | 13 | en |
local.identifier.scopusid | 85096400116 | en |
local.peerreviewed | Yes | en |
local.identifier.volume | 426 | en |
local.contributor.lastname | Loxley | en |
dc.identifier.staff | une-id:ploxley | en |
local.profile.orcid | 0000-0003-3659-734X | en |
local.profile.role | author | en |
local.identifier.unepublicationid | une:1959.11/30943 | en |
local.date.onlineversion | 2020-11-04 | - |
dc.identifier.academiclevel | Academic | en |
local.title.maintitle | A sparse code increases the speed and efficiency of neuro-dynamic programming for optimal control tasks with correlated inputs | en |
local.output.categorydescription | C1 Refereed Article in a Scholarly Journal | en |
local.search.author | Loxley, Peter N | en |
local.uneassociation | Yes | en |
local.atsiresearch | No | en |
local.sensitive.cultural | No | en |
local.identifier.wosid | 000606709400001 | en |
local.year.available | 2020 | en |
local.year.published | 2021 | en |
local.fileurl.closedpublished | https://rune.une.edu.au/web/retrieve/01c18ee5-0e4c-45d7-9e6b-e06bf9dad459 | en |
local.subject.for2020 | 460209 Planning and decision making | en |
local.subject.for2020 | 461105 Reinforcement learning | en |
local.subject.for2020 | 461301 Coding, information theory and compression | en |
local.subject.seo2020 | 280115 Expanding knowledge in the information and computing sciences | en |
local.subject.seo2020 | 280102 Expanding knowledge in the biological sciences | en |
Appears in Collections: | Journal Article, School of Science and Technology |
SCOPUS™ Citations: 1 (checked on Aug 17, 2024)

Page view(s): 1,020 (checked on Apr 2, 2023)

Download(s): 2 (checked on Apr 2, 2023)
Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.