Graph-Based Sparse Learning: Models, Algorithms, and Applications.
Record Type: Electronic resources: Monograph/item
Title/Author: Graph-Based Sparse Learning: Models, Algorithms, and Applications.
Author: Yang, Sen.
Description: 141 p.
Notes: Source: Dissertation Abstracts International, Volume: 76-04(E), Section: B.
Notes: Advisers: Jieping Ye; Peter Wonka.
Contained By: Dissertation Abstracts International, 76-04B(E).
Subject: Computer science.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3666234
ISBN: 9781321391732
Dissertation Note: Thesis (Ph.D.)--Arizona State University, 2014.
Access Restriction: This item must not be sold to any third party vendors.
Abstract: Sparse learning is a powerful tool for generating models of high-dimensional data with high interpretability, and it has many important applications in areas such as bioinformatics, medical image processing, and computer vision. Recently, a priori structural information has been shown to be powerful for improving the performance of sparse learning models, and a graph is a fundamental way to represent structural information about features. This dissertation focuses on graph-based sparse learning. The first part of the dissertation aims to integrate a graph into sparse learning to improve performance. Specifically, the problem of feature grouping and selection over a given undirected graph is considered, and three models are proposed, along with efficient solvers, to achieve simultaneous feature grouping and selection and thereby enhance estimation accuracy. One major challenge is that solving large-scale graph-based sparse learning problems remains computationally demanding. An efficient, scalable, and parallel algorithm for one widely used graph-based sparse learning approach, anisotropic total variation regularization, is therefore proposed by explicitly exploiting the structure of the graph. The second part of the dissertation focuses on uncovering the graph structure from the data. Two issues in graphical modeling are considered: one is the joint estimation of multiple graphical models using a fused lasso penalty, and the other is the estimation of hierarchical graphical models. The key technical contribution is to establish a necessary and sufficient condition for the graphs to be decomposable. Based on this key property, a simple screening rule is presented, which reduces the size of the optimization problem and dramatically lowers the computational cost.
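For reference, the two classes of penalties named in the abstract have well-known standard forms; the following is a generic sketch of those standard forms (with illustrative symbols y, X, beta, S, Theta, lambda, and optional edge weights w) and not necessarily the exact formulations developed in the dissertation. A graph-guided (anisotropic total variation) regularizer couples features that are adjacent in a given undirected graph G = (V, E):

\[
\min_{\beta \in \mathbb{R}^{p}} \; \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 \;+\; \lambda_1 \lVert \beta \rVert_1 \;+\; \lambda_2 \sum_{(i,j) \in E} w_{ij}\,\lvert \beta_i - \beta_j \rvert ,
\]

and joint estimation of K Gaussian graphical models with a fused lasso penalty is commonly written as

\[
\min_{\Theta^{(1)},\dots,\Theta^{(K)} \succ 0} \; \sum_{k=1}^{K} \Big( \operatorname{tr}\big(S^{(k)}\Theta^{(k)}\big) - \log\det \Theta^{(k)} \Big) \;+\; \lambda_1 \sum_{k} \lVert \Theta^{(k)} \rVert_1 \;+\; \lambda_2 \sum_{k < k'} \lVert \Theta^{(k)} - \Theta^{(k')} \rVert_1 ,
\]

where S^{(k)} is the sample covariance matrix of the k-th dataset, the l1 norms are typically applied to off-diagonal entries, and zero off-diagonal entries of \Theta^{(k)} correspond to missing edges in the estimated graph.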
MARC Record:
LDR 02752nmm a2200301 4500
001 457743
005 20150805065230.5
008 150916s2014 ||||||||||||||||| ||eng d
020    $a 9781321391732
035    $a (MiAaPQ)AAI3666234
035    $a AAI3666234
040    $a MiAaPQ $c MiAaPQ
100 1  $a Yang, Sen. $3 708813
245 10 $a Graph-Based Sparse Learning: Models, Algorithms, and Applications.
300    $a 141 p.
500    $a Source: Dissertation Abstracts International, Volume: 76-04(E), Section: B.
500    $a Advisers: Jieping Ye; Peter Wonka.
502    $a Thesis (Ph.D.)--Arizona State University, 2014.
506    $a This item must not be sold to any third party vendors.
520    $a Sparse learning is a powerful tool to generate models of high-dimensional data with high interpretability, and it has many important applications in areas such as bioinformatics, medical image processing, and computer vision. Recently, the a priori structural information has been shown to be powerful for improving the performance of sparse learning models. A graph is a fundamental way to represent structural information of features. This dissertation focuses on graph-based sparse learning. The first part of this dissertation aims to integrate a graph into sparse learning to improve the performance. Specifically, the problem of feature grouping and selection over a given undirected graph is considered. Three models are proposed along with efficient solvers to achieve simultaneous feature grouping and selection, enhancing estimation accuracy. One major challenge is that it is still computationally challenging to solve large scale graph-based sparse learning problems. An efficient, scalable, and parallel algorithm for one widely used graph-based sparse learning approach, called anisotropic total variation regularization is therefore proposed, by explicitly exploring the structure of a graph. The second part of this dissertation focuses on uncovering the graph structure from the data. Two issues in graphical modeling are considered. One is the joint estimation of multiple graphical models using a fused lasso penalty and the other is the estimation of hierarchical graphical models. The key technical contribution is to establish the necessary and sufficient condition for the graphs to be decomposable. Based on this key property, a simple screening rule is presented, which reduces the size of the optimization problem, dramatically reducing the computational cost.
590    $a School code: 0010.
650  4 $a Computer science. $3 199325
650  4 $a Bioinformatics. $3 194415
650  4 $a Biomedical engineering. $3 190330
690    $a 0984
690    $a 0715
690    $a 0541
710 2  $a Arizona State University. $b Computer Science. $3 708551
773 0  $t Dissertation Abstracts International $g 76-04B(E).
790    $a 0010
791    $a Ph.D.
792    $a 2014
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3666234
Items
1 record:
Inventory Number: 000000108682
Location Name: Electronic collection
Item Class: Book
Material Type: Thesis/Dissertation
Call Number: TH 2014
Usage Class: Normal
Loan Status: On shelf
No. of Reservations: 0
Multimedia
Multimedia file: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3666234