
Graduation Thesis






Presentation Transcript


  1. Graduation Thesis: An Improved GMRES Method Using Partial Eigenvector Information. Advisor: Prof. Zhao Jinxi. Candidate: Zhou Zhong, M.S. student in Computational Mathematics, class of 1997. Defense date: June 10, 2000.

  2. Contents: Chapter 1. Introduction. Chapter 2. Methods for Solving Nonsymmetric Linear Systems. Chapter 3. An Improved GMRES Method Augmented with Eigenvectors. Chapter 4. References.

  3. 1.1 The Basic Process of Computational Science and Engineering. 1.2 The Background for Solving Large Nonsymmetric Systems.

  4. Construct a model of the physical problem domain. Apply boundary conditions. Develop a numerical approximation to the governing equations. Compute. Validate the results. Understand the results.

  5. [Equations (1.1)–(1.9); formulas not preserved in the transcript]

  6. [Equations (1.10)–(1.12); formulas not preserved in the transcript]

  7. [Equation (1.13); formula not preserved in the transcript]

  8. 2.1 CGNR Method. 2.2 Lanczos-type Methods. 2.3 GMRES Method. 2.4 Other Iterative Methods.

  9. 2.2.1 BiCG Method 2.2.2 CGS Method 2.2.3 BiCGSTAB Method 2.2.4 QMR Method
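All four Lanczos-type variants listed above have standard library implementations, so they are easy to try side by side. A minimal usage sketch (not from the thesis; it assumes SciPy is available and uses a small synthetic nonsymmetric test matrix, not the thesis's test problems):

```python
# Run BiCG, CGS, BiCGSTAB, and QMR on the same nonsymmetric system.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicg, cgs, bicgstab, qmr

n = 300
# Nonsymmetric, diagonally dominant tridiagonal test matrix (illustrative)
A = diags([np.full(n - 1, -1.0), np.full(n, 4.0), np.full(n - 1, -0.5)],
          offsets=[-1, 0, 1], format="csr")
b = np.ones(n)

for name, solver in [("BiCG", bicg), ("CGS", cgs),
                     ("BiCGSTAB", bicgstab), ("QMR", qmr)]:
    x, info = solver(A, b)          # info == 0 means the solver converged
    print(f"{name:9s} info={info} residual={np.linalg.norm(b - A @ x):.2e}")
```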

  10. $x_k \in x_0 + \operatorname{span}\{r_0, Ar_0, \ldots, A^{k-1}r_0\}$, with residual $r_k = P_k(A)\,r_0$ for a polynomial $P_k$ of degree at most $k$ with $P_k(0) = 1$.
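Slide 10's relation can be checked directly: any iterate drawn from the affine Krylov space $x_0 + \operatorname{span}\{r_0, \ldots, A^{k-1}r_0\}$ has residual $P_k(A)r_0$ with $P_k(0)=1$. A small NumPy sketch (illustrative; the random matrix and coefficient vector are arbitrary choices, not from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 4
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
x0 = np.zeros(n)
r0 = b - A @ x0

# Power basis of the Krylov subspace K_k(A, r0)
K = np.column_stack([np.linalg.matrix_power(A, j) @ r0 for j in range(k)])

c = rng.standard_normal(k)            # any coefficients pick a point in x0 + K_k
x = x0 + K @ c
r = b - A @ x

# The residual polynomial is P_k(t) = 1 - t*(c_0 + c_1 t + ... + c_{k-1} t^{k-1}),
# so P_k(0) = 1. Evaluate P_k(A) r0 and compare with the actual residual.
poly_coeffs = np.concatenate([[1.0], -c])        # ascending powers of A
r_poly = sum(poly_coeffs[j] * (np.linalg.matrix_power(A, j) @ r0)
             for j in range(k + 1))
print(np.allclose(r, r_poly))         # True: r_k = P_k(A) r0
```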

  11. [Equation (2.3); conditions stated for i < j, formulas not preserved in the transcript]

  12. [Equation (2.5); condition stated for j < i, formulas not preserved in the transcript]

  13. I. Look-ahead process: the Lanczos vectors are grouped into blocks delimited by indices $1 = n_1 < n_2 < \dots < n_k \le n < n_{k+1}$, with $j, s = 1, 2, \ldots, k$ (2.6), where the block matrix $D_s$ is nonsingular for $s = 1, 2, \ldots, k-1$, and $D_k$ is nonsingular if $n = n_{k+1} - 1$.

  14. II. Quasi-minimal residual approach [equations (2.8)–(2.9); formulas not preserved in the transcript]

  15. Arnoldi process (2.10); minimal residual approach (2.11), (2.12).
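Combining the Arnoldi process (2.10) with the minimal residual condition (2.11)–(2.12) yields GMRES: build an orthonormal basis $V_m$ of $K_m(A, r_0)$, then minimize $\|\beta e_1 - \bar{H}_m y\|$ and set $x_m = x_0 + V_m y$. A compact restart-free sketch (an illustration, not the thesis's implementation; the breakdown tolerance 1e-14 is an arbitrary choice):

```python
import numpy as np

def gmres_sketch(A, b, x0, m, tol=1e-14):
    """One restart-free GMRES cycle: Arnoldi + (m+1) x m least squares."""
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    n = len(b)
    V = np.zeros((n, m + 1))            # orthonormal Krylov basis
    H = np.zeros((m + 1, m))            # upper Hessenberg H_bar
    V[:, 0] = r0 / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt step
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < tol:           # lucky breakdown: subspace is invariant
            H, V, m = H[:j + 2, :j + 1], V[:, :j + 2], j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    rhs = np.zeros(H.shape[0])
    rhs[0] = beta                       # beta * e_1
    y, *_ = np.linalg.lstsq(H, rhs, rcond=None)   # min ||beta e_1 - H_bar y||
    return x0 + V[:, :m] @ y

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 50)) + 10.0 * np.eye(50)
b = rng.standard_normal(50)
x = gmres_sketch(A, b, np.zeros(50), m=30)
print(np.linalg.norm(b - A @ x))        # small residual for this easy matrix
```

The least-squares problem is only $(m+1)\times m$, so the cost per step is dominated by the matrix-vector product and the orthogonalization against all previous basis vectors.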

  16. 3.1 Introduction 3.2 Adding Approximate Eigenvectors To The Subspace Orderly 3.3 Implementation 3.4 Numerical Experiments 3.5 Conclusion

  17. Theorem 1 [12]: Suppose that A is diagonalizable, $A = XDX^{-1}$, and let $\varepsilon^{(m)} = \min_{p \in P_m,\, p(0)=1} \max_i |p(\lambda_i)|$ (3.2). Then the residual norm at the m-th step of GMRES satisfies $\|r_m\| \le K(X)\,\varepsilon^{(m)}\,\|r_0\|$ (3.3), where $K(X) = \|X\|\,\|X^{-1}\|$.

  18. Theorem 2 [12]: Assume that there are v eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_v$ of A with nonpositive real parts, and let the other eigenvalues be enclosed in a circle centered at C, with C > 0, having radius R with C > R. Then $\varepsilon^{(m)} \le \left[\frac{D}{d}\right]^{v} \left[\frac{R}{C}\right]^{m-v}$ (3.4), where $D = \max_{i=1,\ldots,v;\ j=v+1,\ldots,n} |\lambda_i - \lambda_j|$ and $d = \min_{i=1,\ldots,v} |\lambda_i|$.

  19. Theorem 3 [20]: Suppose A has the spectral decomposition $A = Z\Lambda Z^{-1}$, with all the eigenvalues real and positive. Assuming that the initial guess $x_0$ is the zero vector, we have (3.5), where $K = \lambda_n/\lambda_1$.

  20. Theorem 4 [20]: Suppose A has the spectral decomposition $A = Z\Lambda Z^{-1}$, with all the eigenvalues real and positive. Assume that the minimum residual solution is extracted from the subspace $\operatorname{span}\{b, Ab, \ldots, A^{m-1}b, z_1, z_2, \ldots, z_k\}$, where the $z_i$ are columns of Z. Then (3.6), where $r = b - Ax$ is the residual vector and $K_e = \lambda_n/\lambda_{k+1}$.
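Theorem 4's augmented subspace is easy to experiment with: collect the Krylov vectors and the eigenvectors $z_1, \ldots, z_k$ as columns of a matrix W and minimize $\|b - AWy\|$ by least squares. A sketch (an illustration under assumed data, not the thesis's experiments; the spectrum with three small eigenvalues is chosen so the deflation effect of the augmentation is visible):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, k = 200, 10, 3
# Spectrum with three small eigenvalues that slow the plain Krylov iterate down
lam = np.concatenate([[0.01, 0.05, 0.1], np.linspace(1.0, 10.0, n - 3)])
Z = np.linalg.qr(rng.standard_normal((n, n)))[0]   # orthogonal eigenvector matrix
A = Z @ np.diag(lam) @ Z.T
b = rng.standard_normal(n)

def min_residual_over(W):
    """Minimum residual solution over span of W's columns: min_y ||b - A W y||."""
    y, *_ = np.linalg.lstsq(A @ W, b, rcond=None)
    return np.linalg.norm(b - A @ (W @ y))

Kry = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(m)])
print("Krylov only:", min_residual_over(Kry))
print("augmented  :", min_residual_over(np.column_stack([Kry, Z[:, :k]])))
```

The augmented residual is governed by the effective condition number $K_e = \lambda_n/\lambda_{k+1}$ rather than $\lambda_n/\lambda_1$, which is exactly what Theorem 4 asserts.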

  21. Corollary 1: Suppose A has the spectral decomposition $A = Z\Lambda Z^{-1}$, with $\lambda_i < 0$ for $i \le k$ and $0 < \lambda_{k+1} \le \lambda_{k+2} \le \ldots \le \lambda_n$. Assume that the initial guess $x_0$ is the zero vector and that the minimum residual solution is extracted from the subspace $\operatorname{span}\{b, Ab, \ldots, A^{m-1}b, z_1, z_2, \ldots, z_k\}$, where the $z_i$ are columns of Z. Then (3.7), where $K_e = \lambda_n/\lambda_{k+1}$.

  22. Theorem 5: Suppose A has the spectral decomposition $A = Z\Lambda Z^{-1}$, with $\Lambda$ diagonal. Suppose the GMRES method with eigenvectors added orderly is used with k approximate eigenvectors $y_1, y_2, \ldots, y_k$. Let $\psi_i \equiv \angle(y_i, z_i)$, and let $\beta_i$ be the coefficient of $z_i$ in the expansion of b. Then (3.8), where q is a polynomial of degree m or less such that $q(0) = 1$, and m is the dimension of the original Krylov subspace.

  23. Corollary 2: Under the assumptions of Theorem 5, and assuming all the eigenvalues are real and positive, we have (3.12) and (3.13), where $K_e = \lambda_n/\lambda_{k+1}$.

  24. [Equations (3.14)–(3.17); formulas not preserved in the transcript] The approximate eigenvectors satisfy $W^*A^*W g_i = (1/\theta_i)\, W^*A^*AW g_i$ (3.18), i.e. $F g_i = (1/\theta_i)\, G g_i$ with $F = W^*A^*W$ and $G = W^*A^*AW$, and $y_i = W g_i$.
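A sketch of how the extraction in (3.18) can be carried out (illustrative, not the thesis's implementation; it assumes W is an orthonormal basis of the search space, built here from a power basis for brevity, and uses SciPy's generalized eigensolver):

```python
import numpy as np
from scipy.linalg import eig, qr

rng = np.random.default_rng(3)
n, m = 100, 10
# Nearly symmetric test matrix so the computed pairs come out essentially real
A = np.diag(np.arange(1.0, n + 1)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Orthonormal basis W of the search space (power basis + QR, for brevity)
Kry = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(m)])
W = qr(Kry, mode="economic")[0]

F = W.conj().T @ A.conj().T @ W          # F = W* A* W
G = W.conj().T @ A.conj().T @ A @ W      # G = W* A* A W
mu, g = eig(F, G)                        # solves F g_i = mu_i G g_i, mu_i = 1/theta_i
theta = 1.0 / mu                         # approximate eigenvalues of A
i = np.argmin(np.abs(theta))             # pick the smallest one
y = (W @ g[:, i]).real                   # y_i = W g_i, approximate eigenvector
t = theta[i].real
print(t, np.linalg.norm(A @ y - t * y) / np.linalg.norm(y))   # eigenresidual
```

Solving the small $m \times m$ generalized problem $F g = (1/\theta)\, G g$ costs almost nothing compared with the outer iteration, which is what makes augmenting each restart with these approximate eigenvectors affordable.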

  25. Expense:
      Algorithm 1: n(m+n+1) mvps
      Algorithm 2: (m+n+1)n mvps if n ≤ s; (m+2s)n + (1−s)s mvps if n > s
      Algorithm 3: … mvps
      Algorithm 4: [(m+s)n + s] mvps
      Storage requirement:
      Algorithms 1–3: (m+2s) vectors of length n
      Algorithm 4: (m+2s), (m+s) vectors of length n

  26. Fig 1.0: Ex. 1, A(300×300), d = (0.1, …, 0.9, 1, 2, …, 291)

  27. Fig 1.1: Ex. 1, A(300×300), d = (0.1, …, 0.9, 1, 2, …, 291)

  28. Fig 2.0: Ex. 2, A(300×300), d = (−5, …, −1, 1, 2, …, 295)

  29. Fig 3.0: Ex. 3, A(1000×1000), d = (1, 1.01, …, 1.04, 2, 3, …, 996)

  30. Fig 4.0: Ex. 4, A(100×100), d = (1, …, 1), sd = 0.9

  31. Acknowledgements: As three years of graduate study draw to a close, I wish to sincerely thank my advisor, Prof. Zhao Jinxi. Throughout my study and research I have always enjoyed his kind concern, and every bit of my progress is inseparable from his careful guidance and training. His rigorous scholarship and his insistence on excellence in research have influenced me deeply and will benefit me for the rest of my life. I also thank Gui Lizhong (M.S.), my senior colleague in the same research group, as well as Liu Qingkun (M.S.) and Shu Jiwu (Ph.D.); the scenes of our lively discussions, mutual encouragement, and sincere cooperation are still vivid before my eyes and hard to forget. Special thanks go to my senior colleague Fan Hongjun (M.S.): the smooth completion of this thesis benefited from his active help and suggestions. Finally, I thank all the classmates who have given me care, support, and help. Their diligent and earnest attitude toward study is a model for me, and their sincere friendship will be a precious asset on the road ahead.
