Contents

Preface xv
Acknowledgments xvii

Chapter 1 Probability Concepts 1
1.1 Introduction 1
1.2 Sets and Probability 1
1.2.1 Basic Definitions 1
1.2.2 Venn Diagrams and Some Laws 3
1.2.3 Basic Notions of Probability 6
1.2.4 Some Methods of Counting 8
1.2.5 Properties, Conditional Probability, and Bayes' Rule 12
1.3 Random Variables 17
1.3.1 Step and Impulse Functions 17
1.3.2 Discrete Random Variables 18
1.3.3 Continuous Random Variables 20
1.3.4 Mixed Random Variables 22
1.4 Moments 23
1.4.1 Expectations 23
1.4.2 Moment Generating Function and Characteristic Function 26
1.4.3 Upper Bounds on Probabilities and Law of Large Numbers 29
1.5 Two- and Higher-Dimensional Random Variables 31
1.5.1 Conditional Distributions 33
1.5.2 Expectations and Correlations 41
1.5.3 Joint Characteristic Functions 44
1.6 Transformation of Random Variables 48
1.6.1 Functions of One Random Variable 49
1.6.2 Functions of Two Random Variables 52
1.6.3 Two Functions of Two Random Variables 59
1.7 Summary 65
Problems 65
Reference 73
Selected Bibliography 73

Chapter 2 Distributions 75
2.1 Introduction 75
2.2 Discrete Random Variables 75
2.2.1 The Bernoulli, Binomial, and Multinomial Distributions 75
2.2.2 The Geometric and Pascal Distributions 78
2.2.3 The Hypergeometric Distribution 82
2.2.4 The Poisson Distribution 85
2.3 Continuous Random Variables 88
2.3.1 The Uniform Distribution 88
2.3.2 The Normal Distribution 89
2.3.3 The Exponential and Laplace Distributions 96
2.3.4 The Gamma and Beta Distributions 98
2.3.5 The Chi-Square Distribution 101
2.3.6 The Rayleigh, Rice, and Maxwell Distributions 106
2.3.7 The Nakagami m-Distribution 115
2.3.8 The Student's t- and F-Distributions 115
2.3.9 The Cauchy Distribution 120
2.4 Some Special Distributions 121
2.4.1 The Bivariate and Multivariate Gaussian Distributions 121
2.4.2 The Weibull Distribution 129
2.4.3 The Log-Normal Distribution 131
2.4.4 The K-Distribution 132
2.4.5 The Generalized Compound Distribution 135
2.5 Summary 136
Problems 137
Reference 139
Selected Bibliography 139

Chapter 3 Random Processes 141
3.1 Introduction and Definitions 141
3.2 Expectations 145
3.3 Properties of Correlation Functions 153
3.3.1 Autocorrelation Function 153
3.3.2 Cross-Correlation Function 153
3.3.3 Wide-Sense Stationary 154
3.4 Some Random Processes 156
3.4.1 A Single Pulse of Known Shape but Random Amplitude and Arrival Time 156
3.4.2 Multiple Pulses 157
3.4.3 Periodic Random Processes 158
3.4.4 The Gaussian Process 161
3.4.5 The Poisson Process 163
3.4.6 The Bernoulli and Binomial Processes 166
3.4.7 The Random Walk and Wiener Processes 168
3.4.8 The Markov Process 172
3.5 Power Spectral Density 174
3.6 Linear Time-Invariant Systems 178
3.6.1 Stochastic Signals 179
3.6.2 Systems with Multiple Terminals 185
3.7 Ergodicity 186
3.7.1 Ergodicity in the Mean 186
3.7.2 Ergodicity in the Autocorrelation 187
3.7.3 Ergodicity of the First-Order Distribution 188
3.7.4 Ergodicity of Power Spectral Density 188
3.8 Sampling Theorem 189
3.9 Continuity, Differentiation, and Integration 194
3.9.1 Continuity 194
3.9.2 Differentiation 196
3.9.3 Integrals 199
3.10 Hilbert Transform and Analytic Signals 201
3.11 Thermal Noise 205
3.12 Summary 211
Problems 212
Selected Bibliography 221

Chapter 4 Discrete-Time Random Processes 223
4.1 Introduction 223
4.2 Matrix and Linear Algebra 224
4.2.1 Algebraic Matrix Operations 224
4.2.2 Matrices with Special Forms 232
4.2.3 Eigenvalues and Eigenvectors 236
4.3 Definitions 245
4.4 AR, MA, and ARMA Random Processes 253
4.4.1 AR Processes 254
4.4.2 MA Processes 262
4.4.3 ARMA Processes 264
4.5 Markov Chains 266
4.5.1 Discrete-Time Markov Chains 267
4.5.2 Continuous-Time Markov Chains 276
4.6 Summary 284
Problems 284
References 287
Selected Bibliography 288

Chapter 5 Statistical Decision Theory 289
5.1 Introduction 289
5.2 Bayes' Criterion 291
5.2.1 Binary Hypothesis Testing 291
5.2.2 M-ary Hypothesis Testing 303
5.3 Minimax Criterion 313
5.4 Neyman-Pearson Criterion 317
5.5 Composite Hypothesis Testing 326
5.5.1 Θ Random Variable 327
5.5.2 θ Nonrandom and Unknown 329
5.6 Sequential Detection 332
5.7 Summary 337
Problems 338
Selected Bibliography 343

Chapter 6 Parameter Estimation 345
6.1 Introduction 345
6.2 Maximum Likelihood Estimation 346
6.3 Generalized Likelihood Ratio Test 348
6.4 Some Criteria for Good Estimators 353
6.5 Bayes' Estimation 355
6.5.1 Minimum Mean-Square Error Estimate 357
6.5.2 Minimum Mean Absolute Value of Error Estimate 358
6.5.3 Maximum A Posteriori Estimate 359
6.6 Cramer-Rao Inequality 364
6.7 Multiple Parameter Estimation 371
6.7.1 θ Nonrandom 371
6.7.2 θ Random Vector 376
6.8 Best Linear Unbiased Estimator 378
6.8.1 One Parameter Linear Mean-Square Estimation 379
6.8.2 θ Random Vector 381
6.8.3 BLUE in White Gaussian Noise 383
6.9 Least-Square Estimation 388
6.10 Recursive Least-Square Estimator 391
6.11 Summary 393
Problems 394
References 398
Selected Bibliography 398

Chapter 7 Filtering 399
7.1 Introduction 399
7.2 Linear Transformation and Orthogonality Principle 400
7.3 Wiener Filters 409
7.3.1 The Optimum Unrealizable Filter 410
7.3.2 The Optimum Realizable Filter 416
7.4 Discrete Wiener Filters 424
7.4.1 Unrealizable Filter 425
7.4.2 Realizable Filter 426
7.5 Kalman Filter 436
7.5.1 Innovations 437
7.5.2 Prediction and Filtering 440
7.6 Summary 445
Problems 445
References 448
Selected Bibliography 448

Chapter 8 Representation of Signals 449
8.1 Introduction 449
8.2 Orthogonal Functions 449
8.2.1 Generalized Fourier Series 451
8.2.2 Gram-Schmidt Orthogonalization Procedure 455
8.2.3 Geometric Representation 458
8.2.4 Fourier Series 463
8.3 Linear Differential Operators and Integral Equations 466
8.3.1 Green's Function 470
8.3.2 Integral Equations 471
8.3.3 Matrix Analogy 479
8.4 Representation of Random Processes 480
8.4.1 The Gaussian Process 483
8.4.2 Rational Power Spectral Densities 487
8.4.3 The Wiener Process 492
8.4.4 The White Noise Process 493
8.5 Summary 495
Problems 496
References 500
Selected Bibliography 500

Chapter 9 The General Gaussian Problem 503
9.1 Introduction 503
9.2 Binary Detection 503
9.3 Same Covariance 505
9.3.1 Diagonal Covariance Matrix 508
9.3.2 Nondiagonal Covariance Matrix 511
9.4 Same Mean 518
9.4.1 Uncorrelated Signal Components and Equal Variances 519
9.4.2 Uncorrelated Signal Components and Unequal Variances 522
9.5 Same Mean and Symmetric Hypotheses 524
9.5.1 Uncorrelated Signal Components and Equal Variances 526
9.5.2 Uncorrelated Signal Components and Unequal Variances 528
9.6 Summary 529
Problems 530
Reference 532
Selected Bibliography 532

Chapter 10 Detection and Parameter Estimation 533
10.1 Introduction 533
10.2 Binary Detection 534
10.2.1 Simple Binary Detection 534
10.2.2 General Binary Detection 543
10.3 M-ary Detection 556
10.3.1 Correlation Receiver 557
10.3.2 Matched Filter Receiver 567
10.4 Linear Estimation 572
10.4.1 ML Estimation 573
10.4.2 MAP Estimation 575
10.5 Nonlinear Estimation 576
10.5.1 ML Estimation 576
10.5.2 MAP Estimation 579
10.6 General Binary Detection with Unwanted Parameters 580
10.6.1 Signals with Random Phase 583
10.6.2 Signals with Random Phase and Amplitude 595
10.6.3 Signals with Random Parameters 598
10.7 Binary Detection in Colored Noise 606
10.7.1 Karhunen-Loève Expansion Approach 607
10.7.2 Whitening Approach 611
10.7.3 Detection Performance 615
10.8 Summary 617
Problems 618
Reference 626
Selected Bibliography 626

Chapter 11 Adaptive Thresholding CFAR Detection 627
11.1 Introduction 627
11.2 Radar Elementary Concepts 629
11.2.1 Range, Range Resolution, and Unambiguous Range 631
11.2.2 Doppler Shift 633
11.3 Principles of Adaptive CFAR Detection 634
11.3.1 Target Models 640
11.3.2 Review of Some CFAR Detectors 642
11.4 Adaptive Thresholding in Code Acquisition of Direct-Sequence Spread Spectrum Signals 648
11.4.1 Pseudonoise or Direct Sequences 649
11.4.2 Direct-Sequence Spread Spectrum Modulation 652
11.4.3 Frequency-Hopped Spread Spectrum Modulation 655
11.4.4 Synchronization of Spread Spectrum Systems 655
11.4.5 Adaptive Thresholding with False Alarm Constraint 659
11.5 Summary 660
References 661

Chapter 12 Distributed CFAR Detection 665
12.1 Introduction 665
12.2 Distributed CA-CFAR Detection 666
12.3 Further Results 670
12.4 Summary 671
References 672

Appendix 675
About the Author 683
Index 685