Numerical Analysis Lab 1: Solving Linear Systems (MATLAB Implementation)

This article presents four common iterative methods for solving linear systems of equations: the Jacobi method, the Gauss–Seidel method, SOR (Successive Over-Relaxation), and the conjugate gradient method. Each method is demonstrated with a MATLAB function implementation, including the preprocessing steps and the core iteration formula.


Jacobi Method

The Jacobi Method is a form of fixed-point iteration. Let D denote the main diagonal of A, L denote the lower triangle of A (entries below the main diagonal), and U denote the upper triangle (entries above the main diagonal). Then A = L + D + U, and the equation to be solved is Lx + Dx + Ux = b. Note that this use of L and U differs from the use in the LU factorization, since all diagonal entries of this L and U are zero. The system of equations Ax = b can be rearranged into a fixed-point iteration of the form

x_{k+1} = D^{-1}(b - (L + U)x_k).

function [x]=jacobi(A,x0,b)
% Jacobi iteration: x_{k+1} = D^{-1}(b - (L+U)x_k)
D=diag(diag(A));    % diagonal part of A
L=tril(A,-1);       % strictly lower triangular part
U=triu(A,1);        % strictly upper triangular part
B=-(D\(L+U));       % iteration matrix -D^{-1}(L+U)
g=D\b;              % constant term D^{-1}b
for k=1:15
    x=B*x0+g;       % fixed-point step
    x0=x;
end
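As a quick cross-check of the iteration above (not part of the original MATLAB code), here is a minimal NumPy sketch on a hypothetical strictly diagonally dominant test system, for which the Jacobi iteration is guaranteed to converge:

```python
import numpy as np

# Hypothetical test system: symmetric and strictly diagonally dominant,
# so the Jacobi iteration converges.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

def jacobi(A, x0, b, iters=50):
    D = np.diag(np.diag(A))       # diagonal part D
    LU = A - D                    # off-diagonal part L + U
    B = -np.linalg.solve(D, LU)   # iteration matrix -D^{-1}(L+U)
    g = np.linalg.solve(D, b)     # constant term D^{-1} b
    x = x0
    for _ in range(iters):
        x = B @ x + g             # fixed-point step x <- Bx + g
    return x

x = jacobi(A, np.zeros(3), b)
print(np.allclose(x, np.linalg.solve(A, b)))  # True
```

The iteration matrix here has spectral radius well below 1, so 50 steps reach the direct-solver answer to machine precision.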

 

Gauss–Seidel Method:

Closely related to the Jacobi Method is an iteration called the Gauss–Seidel Method. The only difference between Gauss–Seidel and Jacobi is that in the former, the most recently updated values of the unknowns are used at each step, even if the updating occurs in the current step. In matrix form, the iteration is

x_{k+1} = (D + L)^{-1}(b - U x_k).

function [x]=gauss(A,x0,b)
% Gauss-Seidel iteration: x_{k+1} = (D+L)^{-1}(b - U x_k)
D=diag(diag(A));    % diagonal part of A
L=tril(A,-1);       % strictly lower triangular part
U=triu(A,1);        % strictly upper triangular part
B=-((D+L)\U);       % iteration matrix -(D+L)^{-1}U
g=(D+L)\b;          % constant term (D+L)^{-1}b
for k=1:15
    x=B*x0+g;
    x0=x;
end
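A NumPy cross-check of the same iteration (hypothetical test data, not from the original article) can be sketched as:

```python
import numpy as np

# Hypothetical test system (same as the Jacobi sketch): strictly
# diagonally dominant, so Gauss-Seidel converges.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

def gauss_seidel(A, x0, b, iters=50):
    DL = np.tril(A)              # D + L (lower triangle incl. diagonal)
    U = np.triu(A, 1)            # strictly upper triangle
    B = -np.linalg.solve(DL, U)  # iteration matrix -(D+L)^{-1} U
    g = np.linalg.solve(DL, b)   # constant term (D+L)^{-1} b
    x = x0
    for _ in range(iters):
        x = B @ x + g
    return x

x = gauss_seidel(A, np.zeros(3), b)
print(np.allclose(x, np.linalg.solve(A, b)))  # True
```

On this system Gauss–Seidel converges roughly twice as fast as Jacobi (its iteration matrix has the square of Jacobi's spectral radius).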

 

Successive Over-Relaxation Method:

The method called Successive Over-Relaxation (SOR) takes the Gauss–Seidel direction toward the solution and "overshoots" to try to speed convergence. Let ω be a real number, and define each component of the new guess x_{k+1} as a weighted average of ω times the Gauss–Seidel formula and 1 − ω times the current guess x_k. The number ω is called the relaxation parameter, and ω > 1 is referred to as over-relaxation. Starting from (L + D + U)x = b, the matrix form of the iteration is

x_{k+1} = (D + ωL)^{-1}[((1 − ω)D − ωU)x_k + ωb].

 

function [x]=sor(A,x0,b,w)
% SOR iteration: x_{k+1} = (D+wL)^{-1}(((1-w)D - wU)x_k + wb)
D=diag(diag(A));    % diagonal part of A
L=tril(A,-1);       % strictly lower triangular part
U=triu(A,1);        % strictly upper triangular part
B=(D+w*L)\((1-w)*D-w*U);   % iteration matrix
g=w*((D+w*L)\b);           % constant term
for k=1:15
    x=B*x0+g;
    x0=x;
end
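A NumPy sketch of the same matrix-form SOR iteration (hypothetical SPD test data; for symmetric positive-definite matrices SOR converges for any 0 < ω < 2):

```python
import numpy as np

# Hypothetical SPD test system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

def sor(A, x0, b, w, iters=50):
    D = np.diag(np.diag(A))
    L = np.tril(A, -1)
    U = np.triu(A, 1)
    M = D + w * L
    B = np.linalg.solve(M, (1 - w) * D - w * U)  # (D+wL)^{-1}((1-w)D - wU)
    g = w * np.linalg.solve(M, b)                # w (D+wL)^{-1} b
    x = x0
    for _ in range(iters):
        x = B @ x + g
    return x

x = sor(A, np.zeros(3), b, 1.1)
print(np.allclose(x, np.linalg.solve(A, b)))  # True
```

Note that with ω = 1 the iteration reduces exactly to Gauss–Seidel.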

Conjugate Gradient Method:

For a symmetric positive-definite matrix A, the conjugate gradient method starts from an initial guess x_0 with residual and search direction r_0 = d_0 = b − Ax_0. Each iteration then computes

α_k = (r_k^T r_k)/(d_k^T A d_k), x_{k+1} = x_k + α_k d_k, r_{k+1} = r_k − α_k A d_k, β_k = (r_{k+1}^T r_{k+1})/(r_k^T r_k), d_{k+1} = r_{k+1} + β_k d_k.

 

function [x0]=conjugate(A,x0,b)
% Conjugate gradient method for symmetric positive-definite A
d0 = b-(A*x0);      % initial search direction
r0 = d0;            % initial residual
for k=1:15
    if norm(r0) < 1e-12, break; end   % stop once converged
    Ad = A*d0;
    a = (norm(r0))^2/(d0'*Ad);        % step length alpha_k
    x0 = x0+a*d0;                     % update solution
    r1 = r0-a*Ad;                     % update residual
    beta = (norm(r1)^2)/(norm(r0)^2);
    d0 = r1+beta*d0;                  % new A-conjugate direction
    r0 = r1;
end
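The same recurrences can be cross-checked in NumPy (hypothetical SPD test data; in exact arithmetic CG solves an n × n SPD system in at most n iterations, so an early-stop test avoids a 0/0 step once the residual vanishes):

```python
import numpy as np

# Hypothetical SPD test system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

def conjugate_gradient(A, x0, b, iters=15, tol=1e-12):
    x = x0
    r = b - A @ x                    # residual
    d = r.copy()                     # search direction
    for _ in range(iters):
        if np.linalg.norm(r) < tol:  # stop once converged
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)   # step length
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d         # A-conjugate direction update
        r = r_new
    return x

x = conjugate_gradient(A, np.zeros(3), b)
print(np.allclose(x, np.linalg.solve(A, b)))  # True
```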

 

Conjugate Gradient Method with Jacobi Preconditioner:

The Jacobi (diagonal) preconditioner takes M = D, the diagonal of A. Each iteration additionally solves Mz_k = r_k for the preconditioned residual z_k; the inner products r_k^T r_k of plain CG are replaced by r_k^T z_k, and the search direction is updated from z_k rather than r_k. Then:

function [x0]=preconditionConjugate(A,x0,b)
% Preconditioned conjugate gradient with Jacobi preconditioner M = D
D = diag(diag(A));
M = D;
r0 = b-(A*x0);      % initial residual
z0 = M\r0;          % preconditioned residual
d0 = z0;            % initial search direction
for k=1:15
    if norm(r0) < 1e-12, break; end   % stop once converged
    Ad = A*d0;
    a = (r0'*z0)/(d0'*Ad);
    x0 = x0+a*d0;
    r1 = r0-a*Ad;
    z1 = M\r1;
    beta = (r1'*z1)/(r0'*z0);
    d0 = z1+beta*d0;   % direction update uses z1 (fixes a bug: not r1)
    r0 = r1; z0 = z1;
end
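A NumPy sketch of the preconditioned iteration (hypothetical SPD test data; since M is diagonal, Mz = r is just an elementwise division):

```python
import numpy as np

# Hypothetical SPD test system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

def pcg_jacobi(A, x0, b, iters=15, tol=1e-12):
    m = np.diag(A)                   # Jacobi preconditioner M = diag(A)
    x = x0
    r = b - A @ x
    z = r / m                        # preconditioned residual z = M^{-1} r
    d = z.copy()
    for _ in range(iters):
        if np.linalg.norm(r) < tol:  # stop once converged
            break
        Ad = A @ d
        alpha = (r @ z) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        z_new = r_new / m
        beta = (r_new @ z_new) / (r @ z)
        d = z_new + beta * d         # direction uses z_new, not r_new
        r, z = r_new, z_new
    return x

x = pcg_jacobi(A, np.zeros(3), b)
print(np.allclose(x, np.linalg.solve(A, b)))  # True
```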

 

Lab: Composite Simpson's Rule for Definite Integrals
1. Understand the composite trapezoidal rule, the composite Simpson's rule, the Romberg method, and composite Gauss–Legendre quadrature.
2. Master the principles of the Newton–Cotes quadrature formulas, including their error terms and degrees of algebraic precision; following the textbook, write a program for the composite Simpson algorithm, implement it in MATLAB, compare against MATLAB's built-in functions, and perform an error analysis.

Lab 2: Solving Nonlinear Equations
Content: use general fixed-point iteration and Newton's method to find roots of nonlinear equations; discuss how the choice of iteration function affects convergence, how the initial value affects the iteration, and compare convergence and convergence rates.
Requirements: master programming the bisection method and Newton's method; learn to use the MATLAB functions solve, fzero, and fsolve to solve nonlinear equations and systems.

Lab 3: Numerical Solution of Linear Systems
Content: implement Gaussian elimination, the Cholesky algorithm, and LU decomposition in MATLAB to solve linear systems.
Requirements: this lab tests the ability to program comprehensively in MATLAB. Design and implement your own versions of Gaussian elimination, Cholesky, and LU decomposition. For a practical problem, set up the linear system and solve it with your own program.

Lab 4: Iterative Methods for Linear Systems
Content: implement the Jacobi iteration, the Gauss–Seidel iteration, successive over-relaxation, and the conjugate gradient method in MATLAB to solve general linear systems.
Requirements: this lab tests the ability to program comprehensively in MATLAB. Design and implement your own versions of the Jacobi, Gauss–Seidel, SOR, and conjugate gradient algorithms. More complex linear systems can then be solved with your own program.