Theory of information and its value
Record type:
Bibliographic - electronic resource : Monograph/item
Title/Author:
Theory of information and its value / by Ruslan L. Stratonovich ; edited by Roman V. Belavkin, Panos M. Pardalos, Jose C. Principe.
Author:
Stratonovich, Ruslan L.
Other authors:
Belavkin, Roman V.
Publisher:
Cham : Springer International Publishing : Imprint: Springer, 2020.
Description:
xxii, 419 p. : ill. (some col.), digital ; 24 cm.
Contained By:
Springer eBooks
Subject:
Information theory.
Electronic resource:
https://doi.org/10.1007/978-3-030-22833-0
ISBN:
9783030228330 (electronic bk.)
Theory of information and its value
Stratonovich, Ruslan L.
Theory of information and its value [electronic resource] / by Ruslan L. Stratonovich ; edited by Roman V. Belavkin, Panos M. Pardalos, Jose C. Principe. - Cham : Springer International Publishing : Imprint: Springer, 2020. - xxii, 419 p. : ill. (some col.), digital ; 24 cm.
Foreword -- Preface -- 1 Definition of information and entropy in the absence of noise -- 2 Encoding of discrete information in the absence of noise and penalties -- 3 Encoding in the presence of penalties. The first variational problem -- 4 The first asymptotic theorem and related results -- 5 Computation of entropy for special cases. Entropy of stochastic processes -- 6 Information in the presence of noise. Shannon's amount of information -- 7 Message transmission in the presence of noise. The second asymptotic theorem and its various formulations -- 8 Channel capacity. Important particular cases of channels -- 9 Definition of the value of information -- 10 The value of Shannon information for the most important Bayesian systems -- 11 Asymptotic results related to the value of information. The third asymptotic theorem -- 12 Information theory and the second law of thermodynamics -- Appendix: Some matrix (operator) identities -- Index.
This English version of Ruslan L. Stratonovich's Theory of Information (1975) builds on the original theory and provides methods, techniques, and concepts for putting it to use in critical applications. By unifying theories of information, optimization, and statistical physics, the value of information theory has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning and artificial intelligence algorithms, and increased computational resources, understanding information has become essential. This book is even more relevant today than when it was first published in 1975. It extends the classic work of R.L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and of filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances to great detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics, and computer science who are specializing in information theory, data analysis, or machine learning.
ISBN: 9783030228330 (electronic bk.)
Standard No.: 10.1007/978-3-030-22833-0 (doi)
Subjects--Topical Terms: Information theory.
LC Class. No.: Q360 / .S773 2020
Dewey Class. No.: 003.54
Theory of information and its value
LDR  03364nmm a2200325 a 4500
001  573692
003  DE-He213
005  20200624143628.0
006  m d
007  cr nn 008maaau
008  200928s2020 sz s 0 eng d
020  $a 9783030228330 $q (electronic bk.)
020  $a 9783030228323 $q (paper)
024 7  $a 10.1007/978-3-030-22833-0 $2 doi
035  $a 978-3-030-22833-0
040  $a GP $c GP
041 0  $a eng
050 4  $a Q360 $b .S773 2020
072 7  $a PBW $2 bicssc
072 7  $a MAT003000 $2 bisacsh
072 7  $a PBW $2 thema
082 04 $a 003.54 $2 23
090  $a Q360 $b .S899 2020
100 1  $a Stratonovich, Ruslan L. $3 861050
245 10 $a Theory of information and its value $h [electronic resource] / $c by Ruslan L. Stratonovich ; edited by Roman V. Belavkin, Panos M. Pardalos, Jose C. Principe.
260  $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2020.
300  $a xxii, 419 p. : $b ill. (some col.), digital ; $c 24 cm.
505 0  $a Foreword -- Preface -- 1 Definition of information and entropy in the absence of noise -- 2 Encoding of discrete information in the absence of noise and penalties -- 3 Encoding in the presence of penalties. The first variational problem -- 4 The first asymptotic theorem and related results -- 5 Computation of entropy for special cases. Entropy of stochastic processes -- 6 Information in the presence of noise. Shannon's amount of information -- 7 Message transmission in the presence of noise. The second asymptotic theorem and its various formulations -- 8 Channel capacity. Important particular cases of channels -- 9 Definition of the value of information -- 10 The value of Shannon information for the most important Bayesian systems -- 11 Asymptotic results related to the value of information. The third asymptotic theorem -- 12 Information theory and the second law of thermodynamics -- Appendix: Some matrix (operator) identities -- Index.
520  $a This English version of Ruslan L. Stratonovich's Theory of Information (1975) builds on the original theory and provides methods, techniques, and concepts for putting it to use in critical applications. By unifying theories of information, optimization, and statistical physics, the value of information theory has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning and artificial intelligence algorithms, and increased computational resources, understanding information has become essential. This book is even more relevant today than when it was first published in 1975. It extends the classic work of R.L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and of filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances to great detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics, and computer science who are specializing in information theory, data analysis, or machine learning.
650 0  $a Information theory. $3 183013
650 0  $a Quantum computing. $3 725269
650 14 $a Information and Communication, Circuits. $3 276027
650 24 $a Data Structures and Information Theory. $3 825714
650 24 $a Statistical Physics and Dynamical Systems. $3 760415
650 24 $a Quantum Information Technology, Spintronics. $3 379903
650 24 $a Optimization. $3 274084
700 1  $a Belavkin, Roman V. $3 861051
700 1  $a Pardalos, Panos M. $3 275700
700 1  $a Principe, Jose C. $3 357086
710 2  $a SpringerLink (Online service) $3 273601
773 0  $t Springer eBooks
856 40 $u https://doi.org/10.1007/978-3-030-22833-0
950  $a Mathematics and Statistics (Springer-11649)
Holdings (1 item)
Barcode: 000000180052
Location: Electronic holdings
Circulation category: Book
Material type: eBook
Call number: EB Q360 .S899 2020 2020
Use type: Normal
Loan status: On shelf
Hold status: 0
Multimedia:
Multimedia file: https://doi.org/10.1007/978-3-030-22833-0