
[Machine Learning] Solving Linear Regression Parameters with Least Squares

Review

In *Regression Analysis: Linear Regression*, we derived the loss function for linear regression:
$$J(\theta) = \frac{1}{2}\sum_{i=1}^m\bigg(h_\theta(x^{(i)}) - y^{(i)}\bigg)^2$$
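This loss can be evaluated directly as a sum over samples. The sketch below uses hypothetical data (`X`, `y`, and a candidate `theta` are made up for illustration); `theta` here happens to fit the data exactly, so the loss is zero:

```python
import numpy as np

# Hypothetical data: m = 4 samples, n = 2 features (illustrative values only).
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
y = np.array([5.0, 4.0, 11.0, 10.0])
theta = np.array([1.0, 2.0])  # candidate parameters

# J(theta) = 1/2 * sum_i (h_theta(x^(i)) - y^(i))^2,
# where h_theta(x^(i)) is the dot product of the i-th row of X with theta.
J = 0.5 * sum((X[i] @ theta - y[i]) ** 2 for i in range(len(y)))
print(J)  # this theta reproduces y exactly, so J = 0.0
```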

Solving for $\theta$:

  1. Express the loss function in vector form:

$$J(\theta) = \frac{1}{2}\sum_{i=1}^m\bigg(h_\theta(x^{(i)}) - y^{(i)}\bigg)^2 = \frac{1}{2}(X\theta-Y)^T(X\theta-Y)$$

  • where:

$$h_{\theta}(x^{(i)}) = \theta_1x_1^{(i)}+\theta_2 x_2^{(i)} + \dots +\theta_nx_n^{(i)}$$

$$X = \left[ \begin{matrix} x_1^{(1)} & x_2^{(1)} & \dots & x_n^{(1)} \\ x_1^{(2)} & x_2^{(2)} & \dots & x_n^{(2)} \\ & & \dots & \\ x_1^{(m)} & x_2^{(m)} & \dots & x_n^{(m)} \end{matrix} \right]$$

$$\theta = \left[ \begin{matrix} \theta_1 \\ \theta_2 \\ \dots \\ \theta_n \end{matrix} \right]$$

$$Y = \left[ \begin{matrix} y^{(1)} \\ y^{(2)} \\ \dots \\ y^{(m)} \end{matrix} \right]$$

  • m is the number of samples, and n is the number of features per sample.
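The equivalence between the summation form and the vector form $\frac{1}{2}(X\theta-Y)^T(X\theta-Y)$ can be checked numerically. A minimal sketch, using randomly generated data (all names and values here are assumptions for illustration):

```python
import numpy as np

m, n = 5, 3  # m samples, n features (arbitrary illustrative sizes)
rng = np.random.default_rng(0)
X = rng.standard_normal((m, n))      # design matrix, shape (m, n)
Y = rng.standard_normal(m)           # targets, shape (m,)
theta = rng.standard_normal(n)       # parameters, shape (n,)

# Vector form: 1/2 * (X theta - Y)^T (X theta - Y)
r = X @ theta - Y
J_vec = 0.5 * (r @ r)

# Summation form: 1/2 * sum_i (h_theta(x^(i)) - y^(i))^2
J_sum = 0.5 * sum((X[i] @ theta - Y[i]) ** 2 for i in range(m))

assert np.isclose(J_vec, J_sum)  # the two forms agree
```

The vectorized expression is what makes the closed-form derivation (and an efficient NumPy implementation) possible, since it turns m per-sample residuals into a single matrix product.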