Principles of Communications (通信原理, English Edition)

Chapter 2  Signals

2.1 Classification of Signals

2.1.1 Deterministic signals and random signals
What is a deterministic signal? What is a random signal?

2.1.2 Energy signals and power signals
Signal power: let R = 1 Ω; then P = V²/R = I²R = V² = I².
Signal energy: let S represent V or I. If S varies with time, it can be written as s(t); hence the signal energy is
  E = ∫ s²(t) dt  (integral over all time).
An energy signal satisfies 0 < E < ∞.
Average power: P = lim(T→∞) (1/T) ∫ from -T/2 to T/2 of s²(t) dt; for an energy signal, P = 0.
For a power signal, 0 < P < ∞, i.e., a power signal has infinite duration.
An energy signal has finite energy, but its average power equals 0. A power signal has finite average power, but its energy is infinite.
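A small numerical illustration of this classification (not from the original slides; the two test signals and the finite observation window are arbitrary choices):

    import numpy as np

    # Sketch: estimate signal energy E = ∫ s^2(t) dt and average power
    # P = (1/T) ∫_{-T/2}^{T/2} s^2(t) dt on a finite time grid.
    t = np.linspace(-50.0, 50.0, 200_001)          # observation window T = 100 s
    dt = t[1] - t[0]

    pulse = np.where(np.abs(t) <= 0.5, 1.0, 0.0)   # rectangular pulse, width 1 s
    sine  = np.cos(2 * np.pi * 1.0 * t)            # everlasting cosine

    for name, s in [("rectangular pulse", pulse), ("cosine", sine)]:
        energy = np.sum(s**2) * dt                 # ≈ ∫ s^2(t) dt over the window
        power  = energy / (t[-1] - t[0])           # ≈ (1/T) ∫ s^2(t) dt
        print(f"{name:17s}  E ≈ {energy:8.3f} J   P ≈ {power:8.5f} W")

    # The pulse keeps a finite energy (≈ 1 J) while its power → 0 as T grows:
    # an energy signal.  The cosine's energy grows with T but its power stays
    # ≈ 0.5 W: a power signal.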

2.2 Characteristics of deterministic signals

2.2.1 Characteristics in the frequency domain
Frequency spectrum of a power signal: let s(t) be a periodic power signal with period T0. Then
  C(jnω0) = (1/T0) ∫ from -T0/2 to T0/2 of s(t) e^(-jnω0t) dt,
where ω0 = 2π/T0 = 2πf0. C(jnω0) is a complex function, C(jnω0) = |Cn| e^(jθn), where
  |Cn| — amplitude of the component with frequency nf0,
  θn — phase of the component with frequency nf0.
Fourier series of the signal s(t):
  s(t) = Σ over n of C(jnω0) e^(jnω0t).

【Example 2.1】Find the spectrum of a periodic rectangular wave.
Solution: Assume the period of the periodic rectangular wave is T, the pulse width is τ, and the amplitude is V (s(t) = V for |t| ≤ τ/2 within one period, 0 elsewhere). Its frequency spectrum is
  C(jnω0) = (Vτ/T) Sa(nπτ/T).
Frequency spectrum (figure).

【Example 2.2】Find the frequency spectrum of a sinusoidal wave after full-wave rectification.
Solution: Write the rectified sinusoid explicitly, substitute it into the expression for C(jnω0) above, and expand the result as a Fourier series; the spectrum contains only a d.c. component and even harmonics of the original frequency.
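A quick numerical check of Example 2.1 is sketched below; the period, width and amplitude values are assumed for illustration, and the coefficients are compared against (Vτ/T)·Sa(nπτ/T):

    import numpy as np

    # Sketch for Example 2.1 (assumed values): periodic rectangular wave with
    # period T = 1 s, pulse width tau = 0.2 s, amplitude V = 1.
    T, tau, V = 1.0, 0.2, 1.0
    t = np.linspace(-T / 2, T / 2, 100_001)
    dt = t[1] - t[0]
    s = np.where(np.abs(t) <= tau / 2, V, 0.0)

    w0 = 2 * np.pi / T
    for n in range(0, 6):
        # C(jn w0) = (1/T) ∫_{-T/2}^{T/2} s(t) e^{-j n w0 t} dt
        Cn_num = np.sum(s * np.exp(-1j * n * w0 * t)) * dt / T
        Cn_ana = (V * tau / T) * np.sinc(n * tau / T)   # (V tau / T) Sa(n pi tau / T)
        print(f"n={n}:  numerical {Cn_num.real:+.5f}   analytic {Cn_ana:+.5f}")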

Frequency spectral density of energy signals
Let an energy signal be s(t); its frequency spectral density is its Fourier transform
  S(ω) = ∫ s(t) e^(-jωt) dt.
The inverse Fourier transform of S(ω) gives back the original signal:
  s(t) = (1/2π) ∫ S(ω) e^(jωt) dω.

【Example 2.3】Find the frequency spectral density of a rectangular pulse.
Solution: Let the rectangular pulse have unit amplitude and width τ, i.e., g(t) = 1 for |t| ≤ τ/2 and 0 elsewhere. Its frequency spectral density is its Fourier transform:
  G(ω) = τ Sa(ωτ/2).

【Example 2.4】Find the waveform and the frequency spectral density of the sample function.
Solution: The sample function is defined as Sa(t) = sin t / t. Its frequency spectral density is a gate (rectangular) function: it equals π for |ω| ≤ 1 and 0 for |ω| > 1.
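The same kind of check for Example 2.3, assuming a unit-amplitude pulse of width τ = 1; the direct numerical transform is compared with τ·Sa(ωτ/2):

    import numpy as np

    # Sketch for Example 2.3 (assumed width tau = 1, unit amplitude): compare a
    # direct numerical Fourier transform of the rectangular pulse with the
    # closed-form spectral density tau*Sa(w*tau/2) = tau*sinc(f*tau).
    tau = 1.0
    t = np.linspace(-10.0, 10.0, 200_001)
    dt = t[1] - t[0]
    g = np.where(np.abs(t) <= tau / 2, 1.0, 0.0)

    for f in [0.0, 0.3, 0.7, 1.0, 1.5]:
        S_num = np.sum(g * np.exp(-2j * np.pi * f * t)) * dt   # S(f) = ∫ g(t)e^{-j2πft}dt
        S_ana = tau * np.sinc(f * tau)                          # tau * Sa(pi f tau)
        print(f"f={f:3.1f} Hz:  numerical {S_num.real:+.4f}   analytic {S_ana:+.4f}")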

【Example 2.5】Find the unit impulse function and its frequency spectral density.
Solution: The unit impulse function is usually called the δ function, δ(t). It is defined by
  ∫ δ(t) dt = 1, and δ(t) = 0 for t ≠ 0.
Its frequency spectral density is
  ∫ δ(t) e^(-jωt) dt = 1, a constant for all frequencies.
δ(t) and its frequency spectral density (figure).
Physical meaning of the δ function: it is a pulse with infinite height, infinitesimal width, and unit area.
Sa(t) has the following related property: for (k/π) Sa(kt), when k → ∞ the amplitude → ∞ and the zero-spacing of the waveform → 0, while the area stays equal to 1. Hence
  δ(t) = lim(k→∞) (k/π) Sa(kt).

Characteristics of δ(t):
  δ(t) is an even function: δ(-t) = δ(t);
  δ(t) is the derivative of the unit step function: δ(t) = du(t)/dt.

Differences between the frequency spectral density S(f) of an energy signal and the frequency spectrum C(jnω0) of a periodic power signal:
  S(f) is a continuous spectrum; C(jnω0) is a discrete spectrum;
  the unit of S(f) is V/Hz; the unit of C(jnω0) is V;
  the amplitude of S(f) at any single frequency point is infinitesimal.

【Example 2.6】Find the frequency spectral density of a cosinusoidal wave of infinite length.
Solution: Let the cosinusoidal wave be f(t) = cos ω0t. According to eq. (2.2-10), F(ω) can be written as the limit of the transform of a truncated cosine; referencing eq. (2.2-19), it reduces to
  F(ω) = π[δ(ω − ω0) + δ(ω + ω0)].
By introducing δ(t), the concept of frequency spectral density can thus be generalized to power signals.

Energy spectral density
Let the energy of an energy signal s(t) be E; the energy of the signal is given by
  E = ∫ s²(t) dt.
If its frequency spectral density is S(f), then from Parseval's theorem
  E = ∫ s²(t) dt = ∫ |S(f)|² df,
where |S(f)|² is called the energy spectral density. The above equation can be rewritten as
  E = ∫ G(f) df,
where G(f) = |S(f)|² (J/Hz) is the energy spectral density.
Property of G(f): since s(t) is a real function, |S(f)|² is an even function, so E = 2 ∫ from 0 to ∞ of G(f) df.

Power spectral density
Let the truncated signal of s(t) be sT(t), defined on -T/2 < t ≤ T/2, with Fourier transform ST(f). Define the power spectral density of the signal as
  P(f) = lim(T→∞) |ST(f)|² / T,
and obtain the signal power
  P = ∫ P(f) df.
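A minimal sketch verifying Parseval's theorem on a grid (the pulse parameters are assumed, not taken from the slides):

    import numpy as np

    # Sketch: numerical check of Parseval's theorem E = ∫ s^2(t) dt = ∫ |S(f)|^2 df
    # for a rectangular pulse (assumed width 1, unit amplitude).
    t = np.linspace(-10.0, 10.0, 2**16)
    dt = t[1] - t[0]
    s = np.where(np.abs(t) <= 0.5, 1.0, 0.0)

    E_time = np.sum(s**2) * dt                       # energy in the time domain

    S = np.fft.fft(s) * dt                           # approximate S(f) on an FFT grid
    df = 1.0 / (len(t) * dt)
    E_freq = np.sum(np.abs(S)**2) * df               # ∫ G(f) df with G(f) = |S(f)|^2

    print(f"time-domain energy  E = {E_time:.4f} J")
    print(f"freq-domain energy  E = {E_freq:.4f} J   (Parseval)")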

2.2.2 Characteristics in the time domain
Autocorrelation function
Definition of the autocorrelation function of an energy signal:
  R(τ) = ∫ s(t) s(t + τ) dt.
Definition of the autocorrelation function of a power signal:
  R(τ) = lim(T→∞) (1/T) ∫ from -T/2 to T/2 of s(t) s(t + τ) dt.
Characteristics: R(τ) depends only on τ, not on t. When τ = 0, R(0) of an energy signal equals the energy of the signal, and R(0) of a power signal equals the average power of the signal.

Cross-correlation function
Definition for energy signals:
  R12(τ) = ∫ s1(t) s2(t + τ) dt.
Definition for power signals:
  R12(τ) = lim(T→∞) (1/T) ∫ from -T/2 to T/2 of s1(t) s2(t + τ) dt.
Characteristics:
  1. R12(τ) depends on τ and is independent of t.
  2. R21(τ) = R12(-τ). Proof: let x = t + τ; then
     R21(τ) = ∫ s2(t) s1(t + τ) dt = ∫ s2(x - τ) s1(x) dx = R12(-τ).
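A short sketch of the energy-signal autocorrelation, checking R(0) = E for an assumed rectangular pulse:

    import numpy as np

    # Sketch: autocorrelation of an energy signal, R(tau) = ∫ s(t) s(t + tau) dt,
    # estimated on a grid; R(0) should reproduce the signal energy E.
    t = np.linspace(-5.0, 5.0, 20_001)
    dt = t[1] - t[0]
    s = np.where(np.abs(t) <= 0.5, 1.0, 0.0)        # rectangular pulse, E = 1 J

    R = np.correlate(s, s, mode="full") * dt         # R(tau) on a lag grid
    center = len(t) - 1                              # index of tau = 0

    print(f"R(0)           = {R[center]:.4f}")       # ≈ 1.0 = signal energy
    print(f"energy ∫s²dt   = {np.sum(s**2) * dt:.4f}")
    print(f"R(0.5)         = {R[center + int(0.5 / dt)]:.4f}")   # triangle shape: ≈ 0.5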

2.3 Characteristics of random signals

2.3.1 Probability distribution of a random variable
Concept of a random variable: if the random outcome of a trial A is expressed by X, we call X a random variable and denote its value by x. For example, the number of calls received within a given period of time at a telephone exchange is a random variable.

Distribution function of a random variable
Definition: FX(x) = P(X ≤ x).
Characteristics:
  P(a < X ≤ b) + P(X ≤ a) = P(X ≤ b),
  P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a),
  P(a < X ≤ b) = FX(b) − FX(a).

Distribution function of a discrete random variable: let the values of X be x1 < x2 < … < xi < … < xn, with probabilities p1, p2, …, pi, …, pn respectively. Then
  P(X < x1) = 0,  P(X ≤ xn) = 1,
  P(X ≤ xi) = P(X = x1) + P(X = x2) + … + P(X = xi).
Characteristics: FX(-∞) = 0, FX(+∞) = 1; if x1 < x2, then FX(x1) ≤ FX(x2) — a monotonically increasing (non-decreasing) function.

Distribution function of a continuous random variable: when x is continuous, it follows from the definition FX(x) = P(X ≤ x) that FX(x) is a continuous, monotonically increasing function.

2.3.2 Probability density of a random variable
Probability density of a continuous random variable, pX(x)
Definition: pX(x) = dFX(x)/dx.
Meaning: pX(x) is the derivative of FX(x), i.e., the slope of the curve of FX(x).
P(a < X ≤ b) can be found from pX(x):
  P(a < X ≤ b) = ∫ from a to b of pX(x) dx.
Characteristics of pX(x): pX(x) ≥ 0, and ∫ pX(x) dx = 1.

Probability density of a discrete random variable
The distribution function of a discrete random variable can be written as
  FX(x) = Σ over i of pi u(x − xi),
where pi is the probability of x = xi and u(x) is the unit step function. Taking the derivative of both sides of the above equation, we obtain its probability density
  pX(x) = Σ over i of pi δ(x − xi).
Characteristics: when x ≠ xi, pX(x) = 0; when x = xi, pX(x) → ∞.

2.4 Examples of frequently used random variables

Random variable with normal distribution
Definition: probability density
  p(x) = (1/(√(2π)σ)) exp[−(x − a)²/(2σ²)],
where σ > 0 and a are constants.
Probability density curve (figure).

Random variable with uniform distribution
Definition: probability density
  p(x) = 1/(b − a) for a ≤ x ≤ b, and 0 elsewhere,
where a and b are constants.
Probability density curve (figure).

Random variable with Rayleigh distribution
Definition: probability density
  p(x) = (2x/a) exp(−x²/a),  x ≥ 0,
where a > 0 is a constant.
Probability density curve (figure).
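The three densities written out explicitly and checked to integrate to 1 (parameter values are arbitrary; the Rayleigh density uses the (2x/a)·exp(−x²/a) form given above):

    import numpy as np

    # Sketch: the three probability density functions of this section, checked
    # to integrate to 1 on a grid (parameter values are assumed).
    def normal_pdf(x, a=0.0, sigma=1.0):
        return np.exp(-(x - a)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

    def uniform_pdf(x, a=-1.0, b=2.0):
        return np.where((x >= a) & (x <= b), 1.0 / (b - a), 0.0)

    def rayleigh_pdf(x, a=2.0):
        return np.where(x >= 0, (2 * x / a) * np.exp(-x**2 / a), 0.0)

    x = np.linspace(-10.0, 10.0, 200_001)
    for name, p in [("normal", normal_pdf(x)),
                    ("uniform", uniform_pdf(x)),
                    ("Rayleigh", rayleigh_pdf(x))]:
        print(f"{name:8s}  ∫ p(x) dx ≈ {np.trapz(p, x):.4f}")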

2.5 Numerical characteristics of a random variable

2.5.1 Mathematical expectation
Definition for a continuous random variable:
  E(X) = ∫ x pX(x) dx.
Characteristic: if X and Y are independent of each other, and E(X) and E(Y) exist, then
  E(XY) = E(X) E(Y).

2.5.2 Variance
Definition:
  D(X) = E{[X − E(X)]²}.
The variance can be rewritten as
  D(X) = E(X²) − [E(X)]².
Proof: for a discrete variable, D(X) = Σ (xi − a)² pi; for a continuous variable, D(X) = ∫ (x − a)² pX(x) dx, where a = E(X); expanding the square in either case gives E(X²) − 2aE(X) + a² = E(X²) − [E(X)]².
Characteristics:
  D(C) = 0, D(X + C) = D(X), D(CX) = C² D(X);
  D(X + Y) = D(X) + D(Y) for independent X and Y;
  D(X1 + X2 + … + Xn) = D(X1) + D(X2) + … + D(Xn) for mutually independent Xi.

2.5.3 Moment
Definition: the k-th moment of a random variable X about a point a is E[(X − a)^k].
The k-th origin moment is the moment when a = 0: E(X^k).
The k-th central moment is the moment when a = E(X): E{[X − E(X)]^k}.
Characteristics: the first origin moment is the mathematical expectation E(X); the second central moment is the variance D(X).
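A Monte-Carlo check of D(X) = E(X²) − [E(X)]², using an arbitrary test distribution (uniform on [0, 4]):

    import numpy as np

    # Sketch: verify the moment relations for a uniform [0, 4] test variable.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 4.0, size=1_000_000)

    EX  = X.mean()                      # first origin moment (mathematical expectation)
    EX2 = (X**2).mean()                 # second origin moment
    var = ((X - EX)**2).mean()          # second central moment (variance)

    print(f"E(X)             ≈ {EX:.4f}   (exact 2)")
    print(f"E(X²) - E(X)²    ≈ {EX2 - EX**2:.4f}   (exact 4/3 ≈ 1.3333)")
    print(f"D(X)             ≈ {var:.4f}")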

2.6 Random process

2.6.1 Basic concept of a random process
  X(A, t) — the ensemble consisting of all possible "realizations" of an event A;
  X(Ai, t) — one realization of event A; it is a deterministic time function;
  X(A, tk) — the value of the function at the given time tk.
Denote for short: X(A, t) → X(t), X(Ai, t) → Xi(t).
Example: receiver noise.
Numerical characteristics of a random process:
  statistical mean: a(t) = E[X(t)];
  variance: σ²(t) = E{[X(t) − a(t)]²};
  autocorrelation function: R(t1, t2) = E[X(t1) X(t2)].

2.6.2 Stationary random process
Definition of a stationary random process: a random process whose statistical characteristics are independent of the time origin is called a stationary random process (or a strictly stationary random process).
Definition of a generalized (wide-sense) stationary random process: a random process whose mean, variance and autocorrelation function are independent of the time origin.
Characteristics: a strictly stationary random process must be a generalized stationary random process, but a generalized stationary random process is not always strictly stationary.

2.6.3 Ergodicity
Significance of ergodicity: one realization of a stationary random process can go through all states of the process.
Characteristic of ergodicity: the time average may be replaced by the statistical (ensemble) mean; a numerical sketch of this idea follows at the end of this subsection. For example,
  statistical mean of an ergodic process: mX = E[X(t)] = lim(T→∞) (1/T) ∫ from -T/2 to T/2 of x(t) dt;
  autocorrelation function of an ergodic process: RX(τ) = E[X(t) X(t+τ)] = lim(T→∞) (1/T) ∫ from -T/2 to T/2 of x(t) x(t+τ) dt.
If a random process is ergodic, then it must be strictly stationary; however, a strictly stationary random process is not always ergodic.

Ergodicity in a stationary communication system
If the signal and the noise are both ergodic, then
  the first origin moment mX = E[X(t)] is the d.c. component of the signal;
  the square of the first origin moment, mX², is the normalized d.c. power of the signal;
  the second origin moment E[X²(t)] is the normalized average power of the signal;
  the square root of the second origin moment, [E X²(t)]^(1/2), is the root-mean-square value of the signal current or voltage;
  the second central moment σX² is the normalized average power of the a.c. component of the signal;
  if mX = mX² = 0, then σX² = E[X²(t)];
  the standard deviation σX is the root-mean-square value of the a.c. component of the signal;
  if mX = 0, then σX is the root-mean-square value of the signal itself.
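A sketch of the ergodicity idea, using the random-phase sinusoid X(t) = A·cos(ω0t + θ), θ uniform on [0, 2π) — a standard example of an ergodic process (the amplitude and frequency values are assumed):

    import numpy as np

    # Sketch: for the random-phase sinusoid, the time average over one
    # realization matches the ensemble average, and the time-average power
    # equals A^2 / 2 (the normalized average power).
    rng = np.random.default_rng(1)
    A, w0 = 2.0, 2 * np.pi * 5.0
    t = np.linspace(0.0, 100.0, 1_000_001)

    # Ensemble average at the fixed time t = 0, over many realizations
    thetas = rng.uniform(0.0, 2 * np.pi, size=100_000)
    ensemble_mean = np.mean(A * np.cos(thetas))

    # Time average over a single realization
    theta0 = rng.uniform(0.0, 2 * np.pi)
    x = A * np.cos(w0 * t + theta0)
    time_mean  = x.mean()
    time_power = (x**2).mean()

    print(f"ensemble mean at t=0 ≈ {ensemble_mean:+.4f}   time mean ≈ {time_mean:+.4f}")
    print(f"time-average power   ≈ {time_power:.4f}   (A²/2 = {A**2 / 2:.4f})")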

2.6.4 Autocorrelation function and power spectral density of a stationary random process
Characteristics of the autocorrelation function R(τ) = E[X(t) X(t + τ)]:
  R(0) = E[X²(t)] — the average power of the process;
  R(τ) is an even function: R(−τ) = R(τ);
  |R(τ)| ≤ R(0);
  R(∞) = mX² — the d.c. power;
  R(0) − R(∞) = σX² — the a.c. power.

Characteristics of the power spectral density
Review: the power spectral density of a deterministic signal is P(f) = lim(T→∞) |ST(f)|²/T. Similarly, the power spectral density of a stationary random process is
  PX(f) = lim(T→∞) E[|ST(f)|²] / T,
and the average power is
  P = ∫ PX(f) df = R(0).
Relationship between the autocorrelation function and the power spectral density: they form a Fourier transform pair (the Wiener–Khinchine relation),
  PX(f) = ∫ R(τ) e^(−j2πfτ) dτ,   R(τ) = ∫ PX(f) e^(j2πfτ) df.

【Example 2.7】A random signal x(t) takes only the two values +a and −a, and the number k of sign changes of its amplitude in a time interval T obeys the Poisson distribution
  P(k) = (μT)^k e^(−μT) / k!,
where μ is the average number of sign changes per unit time. Find its autocorrelation function R(τ) and power spectral density P(f).

Solution: As can be seen from the waveform of x(t), the product x(t) x(t − τ) has only two possible values, a² or −a². Hence the defining equation reduces to
  R(τ) = a² · P(a² occurs) + (−a²) · P(−a² occurs),
where the occurrence probabilities can be calculated from the Poisson distribution P(k). If the number of sign changes of x(t) within τ seconds is even, then +a² occurs; if it is odd, then −a² occurs. Therefore, using τ instead of T in the Poisson distribution,
  R(τ) = a² [P(k even) − P(k odd)] = a² e^(−2μτ),  τ ≥ 0.
The interval in the Poisson distribution must be non-negative, so for negative τ the equation is written with −τ. Combining the two cases, we finally obtain
  R(τ) = a² e^(−2μ|τ|).
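A simulation sketch of the random telegraph wave from Example 2.7, with assumed values of a and μ, comparing the estimated autocorrelation with a²·e^(−2μ|τ|):

    import numpy as np

    # Sketch of Example 2.7 (assumed parameters): amplitude ±a, sign changes
    # forming a Poisson process of rate mu; compare the estimated R(tau)
    # with the theoretical a^2 * exp(-2 * mu * |tau|).
    rng = np.random.default_rng(2)
    a, mu = 1.0, 3.0                      # amplitude and mean sign changes per second
    T_total, dt = 5000.0, 1e-3
    t = np.arange(0.0, T_total, dt)

    # Sign-change instants: exponential inter-arrival times with mean 1/mu
    gaps = rng.exponential(1.0 / mu, size=int(2 * mu * T_total))
    change_times = np.cumsum(gaps)
    n_changes = np.searchsorted(change_times, t)      # N(t): changes up to time t
    x = a * np.where(n_changes % 2 == 0, 1.0, -1.0)   # x(t) = ±a

    for tau in [0.0, 0.1, 0.3, 0.6]:
        k = int(round(tau / dt))
        R_est = np.mean(x[: len(x) - k] * x[k:])      # time-average estimate of R(tau)
        R_th  = a**2 * np.exp(-2 * mu * tau)
        print(f"tau={tau:4.2f} s  estimate ≈ {R_est:+.4f}   a² e^(-2 mu tau) = {R_th:+.4f}")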

The power spectral density P(f) is obtained from the Fourier transform of R(τ):
  P(f) = ∫ a² e^(−2μ|τ|) e^(−j2πfτ) dτ = 4a²μ / [(2μ)² + (2πf)²].
Curves of P(f) and R(τ) (figure).

【Example 2.8】Assume the power spectral density P(f) of a random process is as shown in the figure. Find its autocorrelation function R(τ).
Solution: With P(f) given by the figure, R(τ) follows from the inverse Fourier transform
  R(τ) = ∫ P(f) e^(j2πfτ) df.
Curve of the autocorrelation function (figure).

【Example 2.9】Find the autocorrelation function and the power spectral density of white noise.
Solution: White noise has a uniform (flat) power spectral density Pn(f):
  Pn(f) = n0/2  (double-sided),
where n0 is the single-sided power spectral density (W/Hz). The autocorrelation function of white noise is obtained from its power spectral density by the inverse Fourier transform:
  R(τ) = (n0/2) δ(τ).
As can be seen, the samples of white noise at any two different instants (i.e., τ ≠ 0) are uncorrelated.
Average power of white noise: R(0) = ∫ (n0/2) df → ∞; the average power of white noise is infinite.

2.7 Gaussian random process
Definition: a random process whose n-dimensional joint probability density is the n-dimensional normal probability density is called a Gaussian (normal) random process.
The n-dimensional joint probability density p(x1, x2, …, xn; t1, t2, …, tn) of a Gaussian process is determined only by the means ai, the variances σi², and the normalized covariance coefficients bjk of the individual random variables; hence a generalized stationary Gaussian process is also strictly stationary.
If x1, x2, …, xn are uncorrelated with one another, then bjk = 0 for j ≠ k. In that case the n-dimensional joint probability density equals the product of the one-dimensional probability densities,
  p(x1, x2, …, xn) = p(x1) p(x2) … p(xn),
i.e., the variables are also statistically independent.
If the cross-correlation (covariance) of two random variables equals 0, they are uncorrelated; if the two-dimensional joint probability density of two random variables equals the product of the one-dimensional probability densities, they are independent of each other. Two uncorrelated random variables are not always independent of each other, whereas two independent random variables are certainly uncorrelated. For the random variables of a Gaussian process, however, uncorrelatedness and independence are equivalent.

Characteristics of the probability density of the normal distribution
  p(x) = (1/(√(2π)σ)) exp[−(x − a)²/(2σ²)]
is symmetrical about x = a; p(x) is monotonically increasing on (−∞, a) and monotonically decreasing on (a, ∞), and reaches its maximum at x = a. The maximum value is 1/(√(2π)σ).
When x → −∞ or x → +∞, p(x) → 0.
If a = 0 and σ = 1, the distribution is called the standard normal distribution.

Normal distribution function
The integral of the normal probability density function is defined as the normal distribution function:
  F(x) = P(X ≤ x) = ∫ from −∞ to x of p(z) dz = Φ((x − a)/σ),
where Φ(x) is the probability integral function
  Φ(x) = (1/√(2π)) ∫ from −∞ to x of e^(−z²/2) dz.
This integral is difficult to calculate in closed form; usually a table-lookup method is used instead.

Normal distribution expressed by the error function
Definition of the error function:
  erf(x) = (2/√π) ∫ from 0 to x of e^(−z²) dz.
Definition of the complementary error function:
  erfc(x) = 1 − erf(x) = (2/√π) ∫ from x to ∞ of e^(−z²) dz.
Expression of the normal distribution function in terms of the error function:
  F(x) = 1/2 + (1/2) erf((x − a)/(√2 σ)) = 1 − (1/2) erfc((x − a)/(√2 σ)).
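A small numerical check of the erfc expression above for the standard normal case (a = 0, σ = 1): the tail probability P(X > x) = (1/2)·erfc(x/√2) is compared with direct numerical integration of the density.

    import math
    import numpy as np

    # Sketch: tail probability of the standard normal distribution via erfc,
    # checked against numerical integration of the density.
    def tail_by_erfc(x):
        return 0.5 * math.erfc(x / math.sqrt(2.0))

    def tail_by_integration(x, upper=40.0, n=200_001):
        u = np.linspace(x, upper, n)
        p = np.exp(-u**2 / 2) / math.sqrt(2 * math.pi)
        return np.trapz(p, u)

    for x in [0.0, 1.0, 2.0, 3.0]:
        print(f"x={x:.1f}:  erfc form {tail_by_erfc(x):.6e}   integral {tail_by_integration(x):.6e}")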

2.8 Narrow band random process

2.8.1 Basic concept of a narrow band random process
What does "narrow band" mean? Assume the bandwidth of a random process is Δf and its central frequency is fc. If Δf << fc, the random process is called a narrow band random process.
Waveform and spectrum of a narrow band random process (figure): the waveform looks like a sinusoid whose frequency is approximately fc and whose envelope and phase vary slowly.
Expression:
  X(t) = aX(t) cos[ω0t + φX(t)],  aX(t) ≥ 0,
where
  aX(t) — random envelope of the narrow band random process,
  φX(t) — random phase of the narrow band random process,
  ω0 — angular frequency of the sinusoidal wave.
The above equation can be rewritten as
  X(t) = Xc(t) cos ω0t − Xs(t) sin ω0t,
where
  Xc(t) = aX(t) cos φX(t) — in-phase component of X(t),
  Xs(t) = aX(t) sin φX(t) — quadrature (orthogonal) component of X(t).

2.8.2 Characteristics of a narrow band random process
Statistical characteristics of Xc(t) and Xs(t): if X(t) is a stationary narrow band Gaussian process with zero mean, then Xc(t) and Xs(t) are also Gaussian processes; Xc(t) and Xs(t) have identical variances, equal to the variance of X(t); Xc and Xs at the same instant are uncorrelated and statistically independent.
Statistical characteristics of aX(t) and φX(t): the probability density of aX(t) is the Rayleigh density
  p(a) = (a/σ²) exp(−a²/(2σ²)),  a ≥ 0,
and the probability density of φX(t) is uniform,
  p(φ) = 1/(2π),  0 ≤ φ ≤ 2π.

2.9 Sinusoidal wave plus narrow band Gaussian noise

Expression of a sinusoidal wave plus noise:
  r(t) = A cos(ω0t + θ) + n(t),
where
  A — deterministic amplitude of the sinusoidal wave,
  ω0 — angular frequency of the sinusoidal wave,
  θ — random phase of the sinusoidal wave,
  n(t) — narrow band Gaussian noise.
Probability density of the envelope of r(t):
  pr(x) = (x/σ²) exp[−(x² + A²)/(2σ²)] I0(Ax/σ²),  x ≥ 0,
where
  σ² — variance of n(t),
  I0(·) — zero-order modified Bessel function of the first kind.
pr(x) is called the generalized Rayleigh distribution, or Rician distribution. When A = 0, pr(x) reduces to the Rayleigh probability density.

Conditional probability density of the phase of r(t): pr(φ/θ) denotes the conditional probability density of the phase φ of r(t), given θ, where φ includes both the phase of the sinusoidal wave and the phase of the noise. The probability density of the phase of r(t) is obtained by averaging pr(φ/θ) over θ; when θ = 0 it can be written in closed form.

Curves of the Rician distribution:
  when A/σ = 0: envelope — Rayleigh distribution; phase — uniform distribution;
  when A/σ is very large: envelope — approximately normal distribution; phase — concentrated near the signal phase (approaching an impulse function).
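A simulation sketch of the envelope statistics above, using the in-phase/quadrature form r(t) = (A + nc) cos ω0t − ns sin ω0t so that the envelope is √((A + nc)² + ns²); the values of A and σ are assumed:

    import numpy as np

    # Sketch: envelope samples of "sinusoid + narrow-band Gaussian noise".
    # nc, ns are independent N(0, sigma^2); the envelope density should follow
    # the Rician pdf p(x) = (x/sigma^2) exp(-(x^2+A^2)/(2 sigma^2)) I0(A x / sigma^2).
    rng = np.random.default_rng(3)
    A, sigma = 2.0, 1.0
    nc = rng.normal(0.0, sigma, size=2_000_000)
    ns = rng.normal(0.0, sigma, size=2_000_000)
    env = np.sqrt((A + nc)**2 + ns**2)

    hist, edges = np.histogram(env, bins=60, range=(0.0, 6.0), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    rician = (centers / sigma**2) * np.exp(-(centers**2 + A**2) / (2 * sigma**2)) \
             * np.i0(A * centers / sigma**2)

    for i in range(0, 60, 12):      # print a few sample points of the two curves
        print(f"x={centers[i]:4.2f}:  simulated {hist[i]:.4f}   Rician pdf {rician[i]:.4f}")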

2.10 Signal transfer through linear systems

2.10.1 Basic concept of linear systems
Characteristics of the linear systems discussed here:
  a pair of input terminals and a pair of output terminals;
  passive;
  memoryless;
  time-invariant;
  causal;
  linear: satisfying the superposition principle — if input xi(t) produces output yi(t), then input a1 x1(t) + a2 x2(t) produces output a1 y1(t) + a2 y2(t), where a1 and a2 are both arbitrary constants.
Sketch of a linear system (figure).

2.10.2 Deterministic signal transfer through linear systems
Time-domain analysis method: let
  h(t) — impulse response of the system,
  x(t) — input signal waveform,
  y(t) — output signal waveform;
then
  y(t) = x(t) * h(t) = ∫ x(τ) h(t − τ) dτ.
Frequency-domain analysis method: assume the input is an energy signal and let
  x(t) — input energy signal,
  H(f) — Fourier transform of h(t),
  X(f) — Fourier transform of x(t),
  y(t) — output signal;
then the frequency spectral density Y(f) of the output signal y(t) of the system is
  Y(f) = H(f) X(f),
and y(t) can be found from the inverse Fourier transform of Y(f). (The two methods are compared numerically in the sketch at the end of this subsection.)
If the input x(t) is a periodic power signal with Fourier series coefficients C(jnω0), then the output is
  y(t) = Σ over n of C(jnω0) H(nf0) e^(jnω0t).
If the input x(t) is a non-periodic power signal, it is treated as a random signal (see Section 2.10.3).
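A sketch showing that the two analysis methods agree numerically: time-domain convolution versus Y(f) = H(f)X(f) followed by an inverse transform. The impulse response and input are arbitrary choices (an RC-type exponential decay and a rectangular pulse):

    import numpy as np

    # Sketch: y(t) = x(t) * h(t) computed two ways.
    dt = 1e-3
    t = np.arange(0.0, 4.0, dt)
    x = np.where(t <= 0.5, 1.0, 0.0)              # input: rectangular pulse
    h = 5.0 * np.exp(-5.0 * t)                    # impulse response (decay rate 5 assumed)

    # Time domain: discrete convolution approximating the convolution integral
    y_time = np.convolve(x, h)[: len(t)] * dt

    # Frequency domain: Y(f) = H(f) X(f), then inverse transform
    N = 2 * len(t)                                # zero-pad to avoid circular wrap-around
    Y = np.fft.fft(x, N) * np.fft.fft(h, N) * dt
    y_freq = np.fft.ifft(Y)[: len(t)].real

    print("max |y_time - y_freq| =", np.max(np.abs(y_time - y_freq)))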

【Example 2.10】An RC low-pass filter is shown in Fig. 2.10.4. Find its impulse response and the expression of its output signal when the input is an exponentially attenuated signal.
Solution: Assume
  x(t) — input energy signal,
  y(t) — output energy signal,
  X(f) — frequency spectral density of x(t),
  Y(f) — frequency spectral density of y(t);
then the transfer function of the circuit is
  H(f) = 1 / (1 + j2πfRC).
The impulse response h(t) of the filter is the inverse Fourier transform of H(f):
  h(t) = (1/RC) e^(−t/RC),  t ≥ 0.
The relationship between the output and the input of the filter is y(t) = x(t) * h(t), i.e., Y(f) = H(f) X(f).
Assume the input x(t) is the given exponentially decaying pulse; then the output of the filter is obtained by evaluating the convolution integral
  y(t) = ∫ from 0 to t of h(t − u) x(u) du.

Conditions of distortionless transmission
For a distortionless linear transmission system whose input is an energy signal x(t), the distortionless output signal y(t) is
  y(t) = k x(t − td),
where
  k — attenuation constant,
  td — delay time.
Find the transfer function of the system: taking the Fourier transform of the above equation,
  Y(f) = k e^(−j2πf td) X(f),
so
  H(f) = Y(f)/X(f) = k e^(−j2πf td) = |H(f)| e^(jφ(f)),  with |H(f)| = k and φ(f) = −2πf td.
Conditions of distortionless transmission:
  the amplitude characteristic |H(f)| is independent of frequency;
  the phase characteristic φ(f) is a straight line through the origin.
(In practice, the delay td is usually measured instead of the phase φ(f), because the phase is difficult to measure.)

2.10.3 Random signal transfer through linear systems
For a physically realizable linear system, if the input is a deterministic signal, then
  y(t) = ∫ from 0 to ∞ of h(τ) x(t − τ) dτ.
If the input is a stationary random signal X(t), then the output Y(t) is
  Y(t) = ∫ from 0 to ∞ of h(τ) X(t − τ) dτ.
Mathematical expectation E[Y(t)] of the output Y(t): since the input is a stationary random process, E[X(t − τ)] = E[X(t)] = k, k = constant; hence
  E[Y(t)] = ∫ h(τ) E[X(t − τ)] dτ = k ∫ h(τ) dτ = k H(0),
which does not depend on t.

Autocorrelation function of the output Y(t): based on the definition of the autocorrelation function,
  RY(t1, t1 + τ) = E[Y(t1) Y(t1 + τ)] = ∫∫ h(u) h(v) E[X(t1 − u) X(t1 + τ − v)] du dv.
The integrand E[X(t1 − u) X(t1 + τ − v)] = RX(τ + u − v) is independent of t1 because of the stationarity of X(t); hence
  RY(τ) = ∫∫ h(u) h(v) RX(τ + u − v) du dv.
Y(t) is a generalized stationary random process, because both the mathematical expectation and the autocorrelation function of Y(t) are independent of t1.

Power spectral density PY(f) of the output Y(t): since the power spectral density is the Fourier transform of the autocorrelation function,
  PY(f) = ∫ RY(τ) e^(−j2πfτ) dτ.
Substituting τ' = τ + u − v into the above equation, we obtain
  PY(f) = |H(f)|² PX(f),
i.e., the power spectral density of the output signal equals the power spectral density of the input signal multiplied by |H(f)|².

【Example 2.11】Given white noise with double-sided power spectral density n0/2, find the power spectral density, the autocorrelation function, and the noise power of the white noise after it passes through an ideal low-pass filter.
Solution: The transfer characteristic of an ideal low-pass filter is
  H(f) = k e^(−j2πf td) for |f| ≤ fH, and H(f) = 0 otherwise.
Hence |H(f)|² = k² for |f| ≤ fH. The power spectral density of the output signal is
  PY(f) = |H(f)|² PX(f) = k² n0/2,  |f| ≤ fH.
The autocorrelation function of the output signal is the inverse Fourier transform of PY(f):
  RY(τ) = k² n0 fH Sa(2πfH τ).
The output noise power is PY = RY(0) = k² n0 fH.

2.11 Brief summary
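A simulation sketch of Example 2.11 with assumed numbers for n0, k and fH, emulating band-limited white noise and comparing the measured output power with k²·n0·fH:

    import numpy as np

    # Sketch: discrete white noise of two-sided PSD n0/2 is emulated by Gaussian
    # samples with variance (n0/2)*fs, then band-limited by an ideal low-pass
    # filter of gain k and cutoff fH applied in the frequency domain.
    rng = np.random.default_rng(4)
    n0, k, fH, fs = 2.0, 1.5, 100.0, 10_000.0     # PSD, gain, cutoff, sample rate
    N = 2**20

    noise = rng.normal(0.0, np.sqrt(n0 / 2 * fs), size=N)

    freqs = np.fft.fftfreq(N, d=1.0 / fs)
    H = np.where(np.abs(freqs) <= fH, k, 0.0)      # ideal low-pass transfer function
    out = np.fft.ifft(np.fft.fft(noise) * H).real

    print(f"measured output power ≈ {np.mean(out**2):.2f} W")
    print(f"theory  k² · n0 · fH  = {k**2 * n0 * fH:.2f} W")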
