Title: L1 LASSO and its Bayesian Inference
Contributor(s): Gao, Junbin (author); Antolovich, Michael (author); Kwan, Paul Hing (author)
Publication Date: 2008
Abstract: A new iterative procedure for solving regression problems with the so-called LASSO penalty is proposed by using generative Bayesian modeling and inference. The algorithm produces the anticipated parsimonious or sparse regression models that generalize well on unseen data. The proposed algorithm is robust and requires no manual specification of model hyperparameters. A comparison is given with state-of-the-art methods for constructing sparse regression models, such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares regression (LROLS).
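The abstract's central point is that the LASSO (L1) penalty drives many regression coefficients exactly to zero, yielding sparse models. The paper's own Bayesian iterative procedure is not reproduced here; as a minimal illustration of the penalty's sparsifying effect, the sketch below fits LASSO by plain coordinate descent with soft-thresholding (a standard textbook solver, and an assumption of this example, not the authors' algorithm). All data and function names are hypothetical.

```python
# Minimal coordinate-descent LASSO sketch (NOT the paper's Bayesian method).
# Objective: minimise 0.5 * ||y - Xw||^2 + lam * ||w||_1.
# The soft-thresholding step is what zeroes out weak coefficients,
# producing the sparse models the abstract refers to.

def soft_threshold(rho, lam):
    """Closed-form solution of the one-dimensional LASSO subproblem."""
    if rho < -lam:
        return rho + lam
    if rho > lam:
        return rho - lam
    return 0.0

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent over the p coefficients."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of feature j with the partial residual
            # (prediction with feature j's contribution removed).
            rho = sum(
                X[i][j] * (y[i]
                           - sum(X[i][k] * w[k] for k in range(p))
                           + X[i][j] * w[j])
                for i in range(n)
            )
            z = sum(X[i][j] ** 2 for i in range(n))  # feature-j scale
            w[j] = soft_threshold(rho, lam) / z
    return w

# Toy data: y depends only on the first feature (y = 2 * x1);
# the second feature is noise, so LASSO should set its weight to zero.
X = [[1.0, 0.1], [2.0, -0.2], [3.0, 0.05], [4.0, -0.1]]
y = [2.0, 4.0, 6.0, 8.0]
w = lasso_coordinate_descent(X, y, lam=0.5)
```

On this toy problem the first coefficient comes out slightly below 2 (the L1 penalty shrinks it) while the second is exactly zero, which is the sparsity behaviour the abstract claims for the proposed method as well.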
Publication Type: Conference Publication
Conference Details: 21st Australasian Joint Conference on Artificial Intelligence, Auckland, New Zealand, December 1-5, 2008
Source of Publication: AI 2008: advances in artificial intelligence : 21st Australasian Joint Conference on Artificial Intelligence, Auckland, New Zealand, December 1-5, 2008, p. 318-324
Publisher: Springer
Place of Publication: Berlin
Field of Research (FOR): 080109 Pattern Recognition and Data Mining
Socio-Economic Objective (SEO): 890299 Computer Software and Services not elsewhere classified
HERDC Category Description: E1 Refereed Scholarly Conference Publication
Series Name: Lecture Notes in Artificial Intelligence; Lecture Notes in Computer Science
Series Number: 5360
Appears in Collections: Conference Publication



Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.