Fundamentals of Information Theory (English edition; a Ministry of Industry and Information Technology "Twelfth Five-Year Plan" textbook)
Category: Textbooks → Graduate / Undergraduate / Vocational
Description
- Publisher: Beihang University Press (北京航空航天大學)
- ISBN: 9787512419728
- Editors: Chen Jie (陳傑), Sun Bing (孫兵), Yu Ze (於澤), Zhou Yinqing (周蔭清)
- Pages: 153
- Publication date: 2016-01-01
- Printing date: 2016-01-01
- Binding: paperback
- Format: 16 mo (16開)
- Edition: 1
- Printing: 1
- Word count: 262,000 characters
Edited by Chen Jie, Sun Bing, Yu Ze, and Zhou Yinqing, this textbook takes the basic model of a communication system as its main thread and gives a systematic, comprehensive treatment of the knowledge points a fundamentals-of-information-theory course should cover. The book consists of two parts in 11 chapters. Part 1, on the basic concepts of information theory, covers the introduction, statistical measures of information, discrete sources, lossless coding and data compression, discrete channels and their capacity, and channel coding. Part 2, on applications of information theory, covers rate distortion, continuous sources, continuous channels and their capacity, maximum entropy and spectrum estimation, and computer simulation experiments. Distilled from the authors' roughly ten years of teaching information theory in both Chinese and English and their research practice, the book is suitable for teaching international undergraduate and graduate students.
Contents
Chapter 1 Introduction
  1.1 Concept of information
  1.2 History of information theory
  1.3 Information, messages and signals
  1.4 Communication system model
  1.5 Information theory applications
    1.5.1 Electrical engineering (communication theory)
    1.5.2 Computer science (algorithmic complexity)
  Exercises
Chapter 2 Statistical Measure of Information
  2.1 Information of random events
    2.1.1 Self-information
    2.1.2 Conditional self-information
    2.1.3 Mutual information of events
  2.2 Information of discrete random variables
    2.2.1 Entropy of discrete random variables
    2.2.2 Joint entropy
    2.2.3 Conditional entropy
    2.2.4 Mutual information of discrete random variables
  2.3 Relationship between entropy and mutual information
  2.4 Mutual information and entropy of continuous random variables
    2.4.1 Mutual information of continuous random variables
    2.4.2 Entropy of continuous random variables
  Exercises
Chapter 3 Discrete Source and Its Entropy Rate
  3.1 Mathematical model of source
    3.1.1 Discrete source and continuous source
    3.1.2 Simple discrete source and its extension
    3.1.3 Memoryless source and source with memory
  3.2 Discrete memoryless source
    3.2.1 Definition
    3.2.2 Extension of discrete source
  3.3 Discrete stationary source
    3.3.1 Definition
    3.3.2 Entropy rate of discrete stationary source
  3.4 Discrete Markov source
    3.4.1 Markov chain
    3.4.2 Transition probability
    3.4.3 Markov source and its entropy rate
  Exercises
Chapter 4 Lossless Source Coding and Data Compression
  4.1 Asymptotic equipartition property and typical sequences
  4.2 Lossless source coding
    4.2.1 Encoder
    4.2.2 Block code
    4.2.3 Fixed length code
    4.2.4 Variable length code
  4.3 Data compression
    4.3.1 Shannon coding
    4.3.2 Huffman coding
    4.3.3 Fano coding
  Exercises
Chapter 5 Discrete Channel and Its Capacity
  5.1 Mathematical model of channel
  5.2 Discrete memoryless channel
    5.2.1 Mathematical model of discrete memoryless channel
    5.2.2 Simple DMC
    5.2.3 Extension of discrete memoryless channel
  5.3 Channel combination
  5.4 Channel capacity
    5.4.1 Concept of channel capacity
    5.4.2 Channel capacity of several special discrete channels
    5.4.3 Channel capacity of symmetric channels
    5.4.4 Channel capacity of extended DMC
    5.4.5 Channel capacity of independent parallel DMC
    5.4.6 Channel capacity of the sum channel
    5.4.7 Channel capacity of general discrete channels
  Exercises
Chapter 6 Noisy-channel Coding
  6.1 Probability of error
  6.2 Decoding rules
  6.3 Channel coding
    6.3.1 Simple repetition code
    6.3.2 Linear code
  6.4 Noisy-channel coding theorem
  Exercises
Chapter 7 Rate Distortion
  7.1 Quantization
  7.2 Distortion definition
    7.2.1 Distortion function
    7.2.2 Mean distortion
  7.3 Rate distortion function
    7.3.1 Fidelity criterion for given channel
    7.3.2 Definition of rate distortion function
    7.3.3 Property of rate distortion function
  7.4 Rate distortion theorem and the converse
  7.5 The calculation of rate distortion function
  Exercises
Chapter 8 Continuous Source and Its Entropy Rate
  8.1 Continuous source
  8.2 Entropy of continuous source
  8.3 Maximum entropy of continuous source
  8.4 Joint entropy, conditional entropy and mutual information for continuous random variables
  8.5 Entropy rate of continuous source
  8.6 Rate distortion for continuous source
  Exercises
Chapter 9 Continuous Channel and Its Capacity
  9.1 Capacity of continuous channel
    9.1.1 Capacity of discrete-time channel
    9.1.2 Capacity of continuous-time channel
  9.2 The Gaussian channel
  9.3 Band-limited channels
  9.4 Coding theorem for continuous channel
  Exercises
Chapter 10 Maximum Entropy and Spectrum Estimation
  10.1 Maximum entropy probability distribution
    10.1.1 Maximum entropy distribution
    10.1.2 Examples
  10.2 Maximum entropy spectrum estimation
    10.2.1 Burg's maximum entropy theorem
    10.2.2 Maximum entropy spectrum estimation
  Exercises
Chapter 11 Experiments of Information Theory
  11.1 Measure of information
    11.1.1 Information calculator
    11.1.2 Properties of entropy
  11.2 Simulation of Markov source
  11.3 Performance simulation for source coding
    11.3.1 Shannon coding
    11.3.2 Huffman coding
    11.3.3 Fano coding
  11.4 Simulation of BSC
  11.5 Simulation of the cascade channel
  11.6 Calculation of channel capacity
  11.7 Decoding rules
  11.8 Performance demonstration of channel coding
References