Title
L1 LASSO and its Bayesian Inference
|
|
Publication Date
2008
|
Author(s)
|
Editor(s)
W. Wobcke and M. Zhang
|
|
Type of document
|
Language
|
Entity Type
|
Publisher
|
Place of publication
|
Edition
|
Series
Lecture Notes in Artificial Intelligence

Lecture Notes in Computer Science
|
|
UNE publication id
|
Abstract
A new iterative procedure for solving regression problems with the LASSO penalty is proposed, based on generative Bayesian modeling and inference. The algorithm produces the anticipated parsimonious (sparse) regression models that generalize well on unseen data. It is robust and requires no model hyperparameters to be specified. A comparison is given with state-of-the-art methods for constructing sparse regression models, namely the relevance vector machine (RVM) and local regularization assisted orthogonal least squares (LROLS) regression.
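For context, the L1 (LASSO) penalty referred to in the abstract is the standard one for linear regression; the notation below (response y, design matrix X, coefficients w, penalty weight \lambda, noise variance \sigma^2) is an editorial sketch of that background, not the paper's own formulation:

  \hat{w}_{\mathrm{LASSO}} = \arg\min_{w}\ \tfrac{1}{2}\,\lVert y - Xw\rVert_2^2 + \lambda\,\lVert w\rVert_1, \qquad \lambda \ge 0,

which coincides with the posterior mode of the generative model

  y \mid w \sim \mathcal{N}(Xw,\ \sigma^2 I), \qquad p(w_j) \propto \exp\!\bigl(-(\lambda/\sigma^2)\,\lvert w_j\rvert\bigr),

i.e. a Gaussian likelihood with independent Laplace priors on the coefficients. This well-known equivalence is what makes a generative Bayesian treatment of the L1 penalty natural.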
|
|
Link
|
Citation
AI 2008: Advances in Artificial Intelligence: 21st Australasian Joint Conference on Artificial Intelligence, Auckland, New Zealand, December 1-5, 2008, p. 318-324
|
|
ISBN
|
Start page
318

End page
324