Please use this identifier to cite or link to this item:
https://hdl.handle.net/1959.11/59965
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, Yi | en |
dc.contributor.author | McCane, Brendan | en |
dc.contributor.author | McNaughton, Neil | en |
dc.contributor.author | Huang, Zhiyi | en |
dc.contributor.author | Shadli, Shabah | en |
dc.contributor.author | Neo, Phoebe | en |
dc.date.accessioned | 2024-05-25T10:23:52Z | - |
dc.date.available | 2024-05-25T10:23:52Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | 2019 International Joint Conference on Neural Networks (IJCNN), p. 1-8 | en |
dc.identifier.isbn | 9781728119854 | en |
dc.identifier.isbn | 9781728119861 | en |
dc.identifier.uri | https://hdl.handle.net/1959.11/59965 | - |
dc.description.abstract | <p>In this paper, we propose and implement 'AnxietyDecoder', an EEG-based three-dimensional Convolutional Neural Network architecture that predicts anxious personality and decodes its potential biomarkers from participants' EEG. Since Goal-Conflict-Specific Rhythmicity (GCSR) in the EEG is a sign that an anxiety-related system is at work, we first propose a two-dimensional Conflict-focused CNN (2-D CNN). It simulates the GCSR extraction process but adds automatic frequency-band selection and optimised functional-contrast calculation, thus providing more comprehensive trait-anxiety predictions. Then, to generate more targeted hierarchical features from the local spatio-temporal scale to the global, we propose a three-dimensional Conflict-focused CNN (3-D CNN), which simultaneously integrates information in the temporal and brain-topology-related spatial dimensions. In addition, we embed Layer-wise Relevance Propagation (LRP) into our model to reveal the brain areas most strongly correlated with anxious personality. The experimental results show that the percentage of variance accounted for by our three-dimensional Conflict-focused CNN is 33%, almost four times higher than that of the previous theoretically derived GCSR contrast (7%). It also outperforms the 2-D model (26%), and a t-test shows the difference between the 3-D and 2-D models is significant (t(4) = 5.4962, p = 0.0053). Moreover, the reverse-engineering results provide an interpretable way to understand the prediction decision-making and participants' anxious personality. AnxietyDecoder thus not only sets a new benchmark for EEG-based anxiety prediction but also reveals the EEG components that contribute to its decisions, shedding some light on anxiety-biomarker research.</p> | en |
dc.language | en | en |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en |
dc.relation.ispartof | 2019 International Joint Conference on Neural Networks (IJCNN) | en |
dc.title | AnxietyDecoder: An EEG-based Anxiety Predictor using a 3-D Convolutional Neural Network | en |
dc.type | Conference Publication | en |
dc.relation.conference | IJCNN 2019: International Joint Conference on Neural Networks | en |
dc.identifier.doi | 10.1109/IJCNN.2019.8851782 | en |
local.contributor.firstname | Yi | en |
local.contributor.firstname | Brendan | en |
local.contributor.firstname | Neil | en |
local.contributor.firstname | Zhiyi | en |
local.contributor.firstname | Shabah | en |
local.contributor.firstname | Phoebe | en |
local.profile.school | School of Science & Technology | en |
local.profile.email | sshadli@une.edu.au | en |
local.output.category | E1 | en |
local.record.place | au | en |
local.record.institution | University of New England | en |
local.date.conference | 14th - 19th July, 2019 | en |
local.conference.place | Hungary | en |
local.publisher.place | United States of America | en |
local.format.startpage | 1 | en |
local.format.endpage | 8 | en |
local.title.subtitle | An EEG-based Anxiety Predictor using a 3-D Convolutional Neural Network | en |
local.contributor.lastname | Wang | en |
local.contributor.lastname | McCane | en |
local.contributor.lastname | McNaughton | en |
local.contributor.lastname | Huang | en |
local.contributor.lastname | Shadli | en |
local.contributor.lastname | Neo | en |
dc.identifier.staff | une-id:sshadli | en |
local.profile.orcid | 0000-0002-3607-3469 | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.profile.role | author | en |
local.identifier.unepublicationid | une:1959.11/59965 | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
dc.identifier.academiclevel | Academic | en |
local.title.maintitle | AnxietyDecoder | en |
local.output.categorydescription | E1 Refereed Scholarly Conference Publication | en |
local.conference.details | IJCNN 2019: International Joint Conference on Neural Networks, Hungary, 14th - 19th July, 2019 | en |
local.search.author | Wang, Yi | en |
local.search.author | McCane, Brendan | en |
local.search.author | McNaughton, Neil | en |
local.search.author | Huang, Zhiyi | en |
local.search.author | Shadli, Shabah | en |
local.search.author | Neo, Phoebe | en |
local.uneassociation | No | en |
dc.date.presented | 2019 | - |
local.atsiresearch | No | en |
local.sensitive.cultural | No | en |
local.year.published | 2019 | en |
local.year.presented | 2019 | en |
local.subject.for2020 | 3209 Neurosciences | en |
local.profile.affiliationtype | External Affiliation | en |
local.profile.affiliationtype | External Affiliation | en |
local.profile.affiliationtype | External Affiliation | en |
local.profile.affiliationtype | External Affiliation | en |
local.profile.affiliationtype | External Affiliation | en |
local.profile.affiliationtype | External Affiliation | en |
local.date.moved | 2024-08-15 | en |
Appears in Collections: Conference Publication, School of Science and Technology
Scopus™ Citations: 6 (checked on Jul 6, 2024)
Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.
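The abstract describes a 3-D CNN whose input volume combines the scalp's electrode-grid layout with the time dimension. As a minimal illustrative sketch (not the paper's implementation: the grid size, kernel shape, and function names below are hypothetical), a single "valid" 3-D convolution over such an EEG volume can be written as:

```python
import numpy as np

def conv3d_valid(volume, kernel):
    """Naive 'valid' 3-D convolution (cross-correlation) over one EEG
    volume of shape (rows, cols, time) -- a toy stand-in for a single
    filter of a 3-D CNN layer; sizes here are purely illustrative."""
    D, H, W = volume.shape
    d, h, w = kernel.shape
    out = np.zeros((D - d + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                # Each output value is the kernel-weighted sum of one
                # local spatio-temporal window of the EEG volume.
                out[i, j, k] = np.sum(volume[i:i+d, j:j+h, k:k+w] * kernel)
    return out

# Hypothetical sizes: a 5x5 scalp grid of electrodes, 32 time samples.
eeg = np.random.default_rng(0).standard_normal((5, 5, 32))
kern = np.ones((3, 3, 5)) / 45.0  # an averaging kernel, illustrative only
feat = conv3d_valid(eeg, kern)
print(feat.shape)  # (3, 3, 28)
```

Each output cell summarises a local spatio-temporal neighbourhood, which is how a 3-D CNN can integrate temporal and brain-topology-related spatial information in a single layer; stacking such layers yields the hierarchical local-to-global features the abstract refers to.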