Deep learning through sparse and low-rank modeling
Record type:
Bibliographic - electronic resource : Monograph/item
Title/Author:
Deep learning through sparse and low-rank modeling / edited by Zhangyang Wang, Yun Fu, Thomas S. Huang.
Other authors:
Wang, Zhangyang,
Publisher:
London, United Kingdom : Academic Press, an imprint of Elsevier, 2019.
Extent:
1 online resource.
Subject:
Machine learning.
Electronic resource:
https://www.sciencedirect.com/science/book/9780128136591
ISBN:
9780128136607 (electronic bk.)
[electronic resource] / edited by Zhangyang Wang, Yun Fu, Thomas S. Huang. - London, United Kingdom : Academic Press, an imprint of Elsevier, 2019. - 1 online resource. - Computer vision and pattern recognition series.
Includes bibliographical references and index.
Front Cover; Deep Learning Through Sparse and Low-Rank Modeling; Copyright; Contents; Contributors; About the Editors; Preface; Acknowledgments; 1 Introduction; 1.1 Basics of Deep Learning; 1.2 Basics of Sparsity and Low-Rankness; 1.3 Connecting Deep Learning to Sparsity and Low-Rankness; 1.4 Organization; References; 2 Bi-Level Sparse Coding: A Hyperspectral Image Classification Example; 2.1 Introduction; 2.2 Formulation and Algorithm; 2.2.1 Notations; 2.2.2 Joint Feature Extraction and Classification; 2.2.2.1 Sparse Coding for Feature Extraction
Deep Learning through Sparse Representation and Low-Rank Modeling bridges classical sparse and low-rank models, which emphasize problem-specific interpretability, with recent deep network models that have enabled larger learning capacity and better utilization of big data. It shows how the toolkit of deep learning is closely tied to sparse/low-rank methods and algorithms, providing a rich variety of theoretical and analytic tools to guide the design and interpretation of deep learning models. The development of the theory and models is supported by a wide variety of applications in computer vision, machine learning, signal processing, and data mining. This book will be highly useful for researchers, graduate students, and practitioners working in the fields of computer vision, machine learning, signal processing, optimization, and statistics.
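The sparse coding that runs through the book's contents (e.g., "Sparse Coding for Feature Extraction" and the "Deep l0 Encoders" unfolding chapter) can be illustrated with a minimal sketch, not taken from the book itself: the classic ISTA iteration for the l1-regularized problem min_x 0.5‖Dx − y‖² + λ‖x‖₁, whose unrolling into network layers is the "model unfolding" idea the contents name. All variable names and parameter values here are illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(D, y, lam, n_iter=200):
    """Iterative Shrinkage-Thresholding for min_x 0.5*||Dx - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth term's gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)           # gradient of 0.5*||Dx - y||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy problem: recover a 3-sparse code from 20 random measurements.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]     # sparse ground-truth signal
y = D @ x_true
x_hat = ista(D, y, lam=0.1)
print("nonzeros in x_hat:", np.count_nonzero(np.abs(x_hat) > 1e-2))
```

Each ISTA step is a linear map followed by an elementwise shrinkage nonlinearity, which is exactly the structure that unfolding reinterprets as one layer of a feed-forward network.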
ISBN: 9780128136607 (electronic bk.)
Subjects--Topical Terms:
188639
Machine learning.
Index Terms--Genre/Form:
214472
Electronic books.
LC Class. No.: Q325.5
Dewey Class. No.: 006.31
LDR  04833cmm a2200337 a 4500
001  582236
006  o d
007  cnu|unuuu||
008  210121s2019 enk ob 001 0 eng d
020  $a 9780128136607 (electronic bk.)
020  $a 012813660X (electronic bk.)
020  $a 9780128136591
020  $a 0128136596
035  $a (OCoLC)1097183504
035  $a EL2020156
040  $a N $b eng $c N $d N $d EBLCP $d OPELS $d UKAHL $d OCLCF $d UKMGB $d YDX $d OCLCQ $d UMI $d OCLCQ
041 0  $a eng
050 4  $a Q325.5
082 0 4  $a 006.31 $2 23
245 0 0  $a Deep learning through sparse and low-rank modeling $h [electronic resource] / $c edited by Zhangyang Wang, Yun Fu, Thomas S. Huang.
260  $a London, United Kingdom : $b Academic Press, an imprint of Elsevier, $c 2019.
300  $a 1 online resource.
490 1  $a Computer vision and pattern recognition series
504  $a Includes bibliographical references and index.
505 0  $a Front Cover; Deep Learning Through Sparse and Low-Rank Modeling; Copyright; Contents; Contributors; About the Editors; Preface; Acknowledgments; 1 Introduction; 1.1 Basics of Deep Learning; 1.2 Basics of Sparsity and Low-Rankness; 1.3 Connecting Deep Learning to Sparsity and Low-Rankness; 1.4 Organization; References; 2 Bi-Level Sparse Coding: A Hyperspectral Image Classification Example; 2.1 Introduction; 2.2 Formulation and Algorithm; 2.2.1 Notations; 2.2.2 Joint Feature Extraction and Classification; 2.2.2.1 Sparse Coding for Feature Extraction
505 8  $a 2.2.2.2 Task-Driven Functions for Classification; 2.2.2.3 Spatial Laplacian Regularization; 2.2.3 Bi-level Optimization Formulation; 2.2.4 Algorithm; 2.2.4.1 Stochastic Gradient Descent; 2.2.4.2 Sparse Reconstruction; 2.3 Experiments; 2.3.1 Classification Performance on AVIRIS Indiana Pines Data; 2.3.2 Classification Performance on AVIRIS Salinas Data; 2.3.3 Classification Performance on University of Pavia Data; 2.4 Conclusion; 2.5 Appendix; References; 3 Deep l0 Encoders: A Model Unfolding Example; 3.1 Introduction; 3.2 Related Work; 3.2.1 l0- and l1-Based Sparse Approximations
505 8  $a 3.2.2 Network Implementation of l1-Approximation; 3.3 Deep l0 Encoders; 3.3.1 Deep l0-Regularized Encoder; 3.3.2 Deep M-Sparse l0 Encoder; 3.3.3 Theoretical Properties; 3.4 Task-Driven Optimization; 3.5 Experiment; 3.5.1 Implementation; 3.5.2 Simulation on l0 Sparse Approximation; 3.5.3 Applications on Classification; 3.5.4 Applications on Clustering; 3.6 Conclusions and Discussions on Theoretical Properties; References; 4 Single Image Super-Resolution: From Sparse Coding to Deep Learning; 4.1 Robust Single Image Super-Resolution via Deep Networks with Sparse Prior; 4.1.1 Introduction
505 8  $a 4.1.2 Related Work; 4.1.3 Sparse Coding Based Network for Image SR; 4.1.3.1 Image SR Using Sparse Coding; 4.1.3.2 Network Implementation of Sparse Coding; 4.1.3.3 Network Architecture of SCN; 4.1.3.4 Advantages over Previous Models; 4.1.4 Network Cascade for Scalable SR; 4.1.4.1 Network Cascade for SR of a Fixed Scaling Factor; 4.1.4.2 Network Cascade for Scalable SR; 4.1.4.3 Training Cascade of Networks; 4.1.5 Robust SR for Real Scenarios; 4.1.5.1 Data-Driven SR by Fine-Tuning; 4.1.5.2 Iterative SR with Regularization; Blurry Image Upscaling; Noisy Image Upscaling; 4.1.6 Implementation Details
505 8  $a 4.1.7 Experiments; 4.1.7.1 Algorithm Analysis; 4.1.7.2 Comparison with State-of-the-Art; 4.1.7.3 Robustness to Real SR Scenarios; Data-Driven SR by Fine-Tuning; Regularized Iterative SR; 4.1.8 Subjective Evaluation; 4.1.9 Conclusion and Future Work; 4.2 Learning a Mixture of Deep Networks for Single Image Super-Resolution; 4.2.1 Introduction; 4.2.2 The Proposed Method; 4.2.3 Implementation Details; 4.2.4 Experimental Results; 4.2.4.1 Network Architecture Analysis; 4.2.4.2 Comparison with State-of-the-Art; 4.2.4.3 Runtime Analysis; 4.2.5 Conclusion and Future Work; References
520  $a Deep Learning through Sparse Representation and Low-Rank Modeling bridges classical sparse and low-rank models, which emphasize problem-specific interpretability, with recent deep network models that have enabled larger learning capacity and better utilization of big data. It shows how the toolkit of deep learning is closely tied to sparse/low-rank methods and algorithms, providing a rich variety of theoretical and analytic tools to guide the design and interpretation of deep learning models. The development of the theory and models is supported by a wide variety of applications in computer vision, machine learning, signal processing, and data mining. This book will be highly useful for researchers, graduate students, and practitioners working in the fields of computer vision, machine learning, signal processing, optimization, and statistics.
588 0  $a Vendor-supplied metadata.
650 0  $a Machine learning. $3 188639
655 4  $a Electronic books. $2 local $3 214472
700 1  $a Wang, Zhangyang, $e editor. $3 872363
700 1  $a Fu, Yun, $e editor. $3 872364
700 1  $a Huang, Thomas S., $d 1936- $3 346132
830 0  $a Computer vision and pattern recognition series. $3 845682
856 4 0  $u https://www.sciencedirect.com/science/book/9780128136591
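The $-prefixed codes in the MARC fields above are subfield delimiters (e.g., $a for the main value, $b for a secondary one). A minimal sketch of splitting one such flattened field body into (code, value) pairs; `parse_subfields` is a hypothetical helper written for this page, not part of any catalog software:

```python
import re

def parse_subfields(field):
    """Split a flattened MARC field body into (subfield code, value) pairs."""
    # Each subfield starts with '$' followed by a one-character code;
    # the value runs until the next '$' or the end of the field.
    pairs = re.findall(r"\$(\w)\s*([^$]*)", field)
    return [(code, value.strip()) for code, value in pairs]

# The 082 (Dewey) field from the record above:
print(parse_subfields("$a 006.31 $2 23"))
# → [('a', '006.31'), ('2', '23')]
```

Real MARC transmission format uses a non-printable subfield delimiter (0x1F) rather than a literal "$"; the dollar sign is only the conventional display form used in catalog views like this one.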
Holdings: 1 record

Barcode: 000000187079
Location: Electronic collection
Circulation category: Book
Material type: E-book
Call number: EB Q325.5 2019
Usage type: Normal
Loan status: On shelf
Holds: 0
Multimedia file:
https://www.sciencedirect.com/science/book/9780128136591