Introduction to Markov Processes (lecture slides)
4.9 APPLICATIONS TO MARKOV CHAINS
Department of Applied Physics — Wang Yunlei (王允磊)

What is a Markov chain?
The Markov chains described in this section are used as mathematical models of a wide variety of situations in biology, business, chemistry, engineering, physics, and elsewhere. In each case, the model is used to describe an experiment or measurement that is performed many times in the same way, where the outcome of each trial of the experiment will be one of several specified possible outcomes, and where the outcome of one trial depends only on the immediately preceding trial.

For example, if the population of a city and its suburbs were measured each year, then a vector such as

x0 = (0.60, 0.40)^T

could indicate that 60% of the population lives in the city and 40% in the suburbs. The decimals in x0 add up to 1 because they account for the entire population of the region. Percentages are more convenient for our purpose here than population totals.

A vector with nonnegative entries that add up to 1 is called a probability vector. A stochastic matrix is a square matrix whose columns are probability vectors. A Markov chain is a sequence of probability vectors x0, x1, x2, ..., together with a stochastic matrix P, such that

x1 = P x0,   x2 = P x1,   x3 = P x2,   ...

Thus the Markov chain is described by the first-order difference equation

x_{k+1} = P x_k,   k = 0, 1, 2, ...

When a Markov chain of vectors in R^n describes a system or a sequence of experiments, the entries in x_k list, respectively, the probabilities that the system is in each of n possible states, or the probabilities that the outcome of the experiment is one of n possible outcomes. For this reason, x_k is often called a state vector.

So the population distribution one year later could be written x1 = P x0, where P is the stochastic matrix of annual migration percentages (its entries were shown on the original slide). Similarly, the distribution in 2002 is described by the vector x2 = P x1.
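The recurrence x_{k+1} = P x_k is easy to experiment with numerically. The sketch below is a minimal illustration: the migration matrix on the original slide is not reproduced in this text, so the entries of P here are hypothetical; only the initial vector x0 = (0.60, 0.40)^T comes from the example.

```python
import numpy as np

# Hypothetical annual migration matrix (illustrative values only; the slide's
# actual entries are not reproduced in this text). Columns are probability
# vectors: column 1 = "currently in the city", column 2 = "currently suburbs".
P = np.array([[0.95, 0.03],   # fraction ending up in the city
              [0.05, 0.97]])  # fraction ending up in the suburbs

x0 = np.array([0.60, 0.40])   # 60% city, 40% suburbs, as in the text

# First-order difference equation x_{k+1} = P x_k
x1 = P @ x0   # distribution after one year
x2 = P @ x1   # distribution after two years

print(x1, x1.sum())  # the entries still sum to 1: x1 is a probability vector
print(x2, x2.sum())
```

Note that multiplying a probability vector by a stochastic matrix always yields another probability vector, so every x_k accounts for the whole population.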
What is a Markov matrix?
An n×n matrix satisfying the two properties below:
1. all entries are ≥ 0;
2. every column adds up to 1;
is called a Markov matrix (an example was shown on the original slide). As you can see, the definition of a Markov matrix is closely related to Markov chains and to probability theory.

We can also derive this result.

Lemma: every power of a Markov matrix is still a Markov matrix.

Proof of the lemma: We can use mathematical induction. In fact, we first prove an even stronger result: the product of any two Markov matrices is still a Markov matrix.

Proof: Consider two Markov matrices A = (a_ij) and B = (b_ij), and let C = AB, so that

c_ij = a_i1 b_1j + a_i2 b_2j + ... + a_in b_nj = Σ_{k=1}^n a_ik b_kj.

Then, for every column j,

Σ_{i=1}^n c_ij = Σ_{i=1}^n Σ_{k=1}^n a_ik b_kj = Σ_{k=1}^n (Σ_{i=1}^n a_ik) b_kj = Σ_{k=1}^n 1 · b_kj = 1,

since every column of A and every column of B sums to 1. So the matrix C has property 2: all of its columns add up to 1. Obviously C also has property 1: all of its entries are ≥ 0, being sums of products of nonnegative numbers. We have therefore proved that C is a Markov matrix. Finally, induction on the exponent easily proves the lemma.

Other important properties of a Markov matrix:
1. λ = 1 is an eigenvalue;
2. all other eigenvalues satisfy |λ| ≤ 1.
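The lemma and the two eigenvalue properties can be checked numerically. A small NumPy sketch, using the 3×3 stochastic matrix that appears in this section's later numerical example:

```python
import numpy as np

# The column-stochastic matrix from this section's numerical example.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.8, 0.3],
              [0.2, 0.0, 0.4]])

def is_markov(M, tol=1e-12):
    """True iff all entries are >= 0 and every column sums to 1."""
    return bool((M >= -tol).all() and np.allclose(M.sum(axis=0), 1.0))

# Lemma: products and powers of Markov matrices are again Markov matrices.
assert is_markov(P)
assert is_markov(P @ P)
assert is_markov(np.linalg.matrix_power(P, 10))

# Property 1: lambda = 1 is an eigenvalue.
eigvals = np.linalg.eigvals(P)
assert np.isclose(eigvals, 1.0).any()

# Property 2: every eigenvalue satisfies |lambda| <= 1.
assert (np.abs(eigvals) <= 1.0 + 1e-12).all()

print("all checks passed")
```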
The voting results of a congressional election can be represented by a vector x in R^3 whose three entries list the fractions of the vote won by each of three possible outcomes (the specific entries appeared on the original slide). If we record the outcome of the election every two years by such a vector, and the outcome of one election depends only on the results of the preceding election, then the sequence of vectors that describes the votes every two years may be a Markov chain.

We can take as the stochastic matrix P the matrix whose (i, j)-entry is the fraction of voters for outcome j in one election who vote for outcome i in the next (the numerical entries of P appeared on the original slide).

Determine the likely outcome of the next election, x1 = P x0, and the likely outcome of the election after that, x2 = P x1 = P^2 x0.

How do we predict the distant future?
The most interesting aspect of Markov chains is the study of a chain's long-term behavior.
For instance, what can be said in Example 2 about the voting after many elections have passed (assuming that the given stochastic matrix continues to describe the transition percentages from one election to the next)? Or, what happens to the population distribution in Example 1 in the long run? Before answering these questions, we turn to a numerical example.

Let

P = [ .5  .2  .3 ]             [ 1 ]
    [ .3  .8  .3 ]    and x0 = [ 0 ].
    [ .2   0  .4 ]             [ 0 ]

Computing x_{k+1} = P x_k, the results of further calculations are

x1 = (.5, .3, .2)^T,   x2 = (.37, .45, .18)^T,   x3 = (.329, .525, .146)^T,   ...

These vectors seem to be approaching

q = (0.3, 0.6, 0.1)^T.

For the vector q, we can verify (with no rounding error) that

P q = q.

When the system is in state q, there is no change in the system from one measurement to the next. A vector q with P q = q is called a steady-state vector (or equilibrium vector) for P.

How do we find the steady-state vector?
Since P q = q is equivalent to (P − I)q = 0, we solve the homogeneous system (P − I)x = 0 and choose a basis for its solution space. A simple choice here is

w = (3, 6, 1)^T.
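The convergence of the x_k toward q can be checked numerically with a short power-iteration sketch using this example's P and x0:

```python
import numpy as np

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.8, 0.3],
              [0.2, 0.0, 0.4]])

x = np.array([1.0, 0.0, 0.0])  # x0

# Iterate x_{k+1} = P x_k; the state vectors settle down quickly.
for _ in range(30):
    x = P @ x

q = np.array([0.3, 0.6, 0.1])
print(x)                      # very close to q
print(np.allclose(P @ q, q))  # True: q is a steady-state vector
```

The second-largest eigenvalue of P is 0.5, so the error shrinks roughly by half each step, which is why 30 iterations already land within about 1e-9 of q.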
To make w into a steady-state vector, we divide it by the sum of its entries and obtain

q = w / (3 + 6 + 1) = (0.3, 0.6, 0.1)^T.

The book has this sentence: "It can be shown that every stochastic matrix has a steady-state vector." Next we will prove this claim.

Proof: Let M = P − I. For the matrix P, we know that the columns of P are probability vectors, so the sum of the entries of every column of P is 1. Obviously, the sum of the entries of every column of M is then 0. Denoting the row vectors of M by α1, α2, ..., αn, this means that

α1 + α2 + ... + αn = 0.

So the row vectors of M are linearly dependent, and therefore the determinant of M is 0. By the nature of the homogeneous equation Mx = 0, this proves that its solution space is not the zero space. (Strictly speaking, a steady-state vector also requires a solution with nonnegative entries, which takes an additional argument.)

DEFINITIONS:
1. We say that a stochastic matrix P is regular if some matrix power P^k contains only strictly positive entries.
2. We say that a sequence of vectors {x_k} converges to a vector q if the entries in x_k can be made as close as desired to the corresponding entries in q by taking k sufficiently large.

General Markov processes
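The recipe above (solve P q = q, then scale so the entries sum to 1) can also be carried out as an eigenvector computation. A minimal sketch, again with this section's example matrix:

```python
import numpy as np

P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.8, 0.3],
              [0.2, 0.0, 0.4]])

# Pq = q means q is an eigenvector of P for the eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P)
i = int(np.argmin(np.abs(eigvals - 1.0)))  # index of the eigenvalue closest to 1
w = np.real(eigvecs[:, i])                 # a basis vector for the solution space

# Normalize so the entries sum to 1 (this also fixes the overall sign).
q = w / w.sum()

print(q)  # approximately (0.3, 0.6, 0.1)
```

Dividing by the entry sum plays the same role as dividing w = (3, 6, 1)^T by 10 in the hand computation; it turns any nonzero solution of (P − I)x = 0 into a probability vector.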