Abstract
The adjustment model based on total least squares is superior to ordinary least squares when both the coefficient matrix and the observation vector contain errors. The singular value decomposition method and the Euler-Lagrange approximation algorithm are the two regression parameter estimation methods commonly used for total least squares, but their derivations involve the Eckart-Young-Mirsky matrix approximation theorem and are complicated and difficult to understand, which discourages many learners and limits the methods' application. This paper therefore introduces a new iterative algorithm for regression parameter estimation whose theoretical basis is rigorous and sufficient, whose derivation is clear and easy to follow, and whose computation is easy to program. Practical examples are used to verify the feasibility and effectiveness of the method; the test results show that the linear regression model obtained by the algorithm fits the data well.
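As background to the two approaches mentioned in the abstract, the sketch below illustrates, in Python/NumPy, the classical SVD-based total least squares estimate and a simple fixed-point iteration that converges to the same estimate. It is a generic illustration under assumed synthetic data and illustrative function names; the iterative scheme shown is a common textbook fixed-point form, not necessarily the new algorithm proposed in the paper.

```python
import numpy as np

def tls_svd(A, b):
    """Classical SVD-based total least squares: take the right singular vector
    of the augmented matrix [A | b] belonging to the smallest singular value
    and rescale it so that its last component equals -1."""
    n = A.shape[1]
    C = np.column_stack([A, b])          # augmented matrix [A | b]
    _, _, Vt = np.linalg.svd(C)          # rows of Vt are right singular vectors
    v = Vt[-1]                           # vector for the smallest singular value
    if np.isclose(v[-1], 0.0):
        raise ValueError("TLS solution does not exist (last component is zero)")
    return -v[:n] / v[-1]

def tls_iterative(A, b, iters=50):
    """A simple fixed-point iteration for the same TLS estimate: x solves
    (A^T A - s^2 I) x = A^T b with s^2 = ||Ax - b||^2 / (1 + ||x||^2).
    Starting from the ordinary least squares solution, alternate the updates."""
    AtA, Atb = A.T @ A, A.T @ b
    x = np.linalg.solve(AtA, Atb)        # OLS start value
    for _ in range(iters):
        s2 = np.sum((A @ x - b) ** 2) / (1.0 + x @ x)
        x = np.linalg.solve(AtA - s2 * np.eye(len(x)), Atb)
    return x

# Synthetic check: both the coefficient matrix and the observations are perturbed.
rng = np.random.default_rng(0)
A_true = rng.normal(size=(100, 2))
x_true = np.array([1.5, -0.7])
A_obs = A_true + 0.01 * rng.normal(size=A_true.shape)
b_obs = A_true @ x_true + 0.01 * rng.normal(size=100)
print(tls_svd(A_obs, b_obs))        # close to [1.5, -0.7]
print(tls_iterative(A_obs, b_obs))  # agrees with the SVD estimate
```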
Author
Cao Bangxing (Sontan College, Guangzhou University, Guangzhou 511370, China)
Source
Journal of Dali University (《大理大学学报》)
CAS
2020, No. 6, pp. 1-6 (6 pages in total)
Keywords
total least squares
regression parameters
singular value decomposition method
iterative algorithm
significance tests