Author(s) | Gao, Junbin; Antolovich, Michael; Kwan, Paul Hing
Publication Date | 2008
Abstract | A new iterative procedure for solving regression problems with the so-called LASSO penalty is proposed by using generative Bayesian modeling and inference. The algorithm produces the anticipated parsimonious or sparse regression models that generalize well on unseen data. The proposed algorithm is quite robust and there is no need to specify any model hyperparameters. A comparison with state-of-the-art methods for constructing sparse regression models such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares regression (LROLS) is given.
Citation | AI 2008: Advances in Artificial Intelligence: 21st Australasian Joint Conference on Artificial Intelligence, Auckland, New Zealand, December 1-5, 2008, pp. 318-324
ISBN | 978-3-540-89377-6
Publisher | Springer
Series | Lecture Notes in Artificial Intelligence; Lecture Notes in Computer Science
Edition | 1
Title | L1 LASSO and its Bayesian Inference
Type of document | Conference Publication
Entity Type | Publication
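For reference, the LASSO penalty mentioned in the abstract is, in its standard formulation, the l1-regularized least-squares objective; the sketch below shows this conventional form, not necessarily the paper's exact notation, and the Bayesian reading of the penalty as a Laplace prior is likewise the standard one.

\[
\hat{\beta} = \arg\min_{\beta}\; \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 + \lambda\,\lVert \beta \rVert_1
\]

Under a generative Bayesian view, the l1 term corresponds to a Laplace (double-exponential) prior on the coefficients \(\beta\), with \(\lambda\) playing the role of a hyperparameter; the paper's contribution, per the abstract, is an iterative inference procedure that avoids having to specify such hyperparameters by hand.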