ML — LS & OLS: an introduction to the LS/OLS algorithm, related papers, algorithm improvements (best subset selection, forward stepwise regression), code implementation, and more: a detailed guide
阿新 • Published: 2018-12-11
Introduction to the LS & OLS algorithm
OLS (ordinary least squares) was proposed roughly 200 years ago by Carl Friedrich Gauss and the French mathematician Adrien-Marie Legendre (Legendre published the method in 1805; Gauss claimed to have used it earlier).
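To make the method concrete, here is a minimal sketch of OLS on toy data (the data and numbers are illustrative, not from the original post), using NumPy's `lstsq` to solve the least-squares problem:

```python
import numpy as np

# Toy data generated from y = 1 + 2*x (exactly, no noise).
# First column of X is all ones, so beta[0] is the intercept.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# OLS: find beta minimizing ||X @ beta - y||^2
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # → [1. 2.]
```

Because the data lie exactly on a line, the recovered coefficients match the generating intercept (1) and slope (2).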
Papers on the LS & OLS algorithm
Improvements to the LS & OLS algorithm (best subset selection, forward stepwise regression)
1. Pseudocode for best subset selection
Initialize: Out_of_sample_error = NULL
Break X and Y into test and training sets
for i in range(number of columns in X):
    for each subset of X having i+1 columns:
        fit ordinary least squares model
    Out_of_sample_error.append(least error among subsets containing i+1 columns)
Pick the subset corresponding to least overall error
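The pseudocode above can be sketched in Python as follows (a minimal NumPy implementation written for this guide, not the original post's code; function and variable names are my own):

```python
import numpy as np
from itertools import combinations

def best_subset(X_train, y_train, X_test, y_test):
    """Best subset selection: for each subset size i+1, fit OLS on
    every column subset of that size, record the lowest test RSS,
    then pick the subset with the lowest error overall."""
    n_cols = X_train.shape[1]
    out_of_sample_error = []
    best_subsets = []
    for i in range(n_cols):
        errors = []
        for subset in combinations(range(n_cols), i + 1):
            cols = list(subset)
            beta, *_ = np.linalg.lstsq(X_train[:, cols], y_train, rcond=None)
            rss = np.sum((y_test - X_test[:, cols] @ beta) ** 2)
            errors.append((rss, cols))
        best_err, best_cols = min(errors, key=lambda t: t[0])
        out_of_sample_error.append(best_err)
        best_subsets.append(best_cols)
    k = int(np.argmin(out_of_sample_error))
    return best_subsets[k], out_of_sample_error[k]
```

Note the cost: with p columns this fits 2^p - 1 models, which is why the forward stepwise method below is used as a cheaper greedy alternative.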
2. Pseudocode for forward stepwise regression
Initialize: ColumnList = NULL
            Out-of-sample-error = NULL
Break X and Y into test and training sets
For number of columns in X:
    For each trialColumn (column not in ColumnList):
        Build submatrix of X using ColumnList + trialColumn
        Train OLS on submatrix and store RSS Error on test data
    ColumnList.append(trialColumn that minimizes RSS Error)
    Out-of-sample-error.append(minimum RSS Error)
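The greedy procedure above can likewise be sketched in Python (again a minimal illustration written for this guide, with names of my own choosing, not the original post's code):

```python
import numpy as np

def forward_stepwise(X_train, y_train, X_test, y_test):
    """Forward stepwise regression: at each step, try adding each
    remaining column, keep the one that minimizes test RSS, and
    record that error."""
    n_cols = X_train.shape[1]
    column_list = []
    out_of_sample_error = []
    for _ in range(n_cols):
        best_err, best_col = None, None
        for c in range(n_cols):
            if c in column_list:
                continue
            cols = column_list + [c]  # submatrix columns: chosen + trial
            beta, *_ = np.linalg.lstsq(X_train[:, cols], y_train, rcond=None)
            rss = np.sum((y_test - X_test[:, cols] @ beta) ** 2)
            if best_err is None or rss < best_err:
                best_err, best_col = rss, c
        column_list.append(best_col)
        out_of_sample_error.append(best_err)
    return column_list, out_of_sample_error
```

Unlike best subset selection, this fits only O(p^2) models for p columns, at the price of possibly missing the globally best subset.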
Code implementation of the LS & OLS algorithm
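The original post's implementation is not reproduced here; as a stand-in, the following is a minimal OLS estimator sketch via the normal equations (all names are my own assumptions):

```python
import numpy as np

class OLS:
    """Minimal OLS via the normal equations: beta = (X'X)^+ X'y.
    An intercept column is added automatically."""

    def fit(self, X, y):
        Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
        # pinv (pseudo-inverse) keeps this stable even if X'X is singular
        self.coef_ = np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y
        return self

    def predict(self, X):
        Xb = np.column_stack([np.ones(len(X)), X])
        return Xb @ self.coef_

# Usage: recover intercept 1 and slope 2 from exact data
X = np.arange(10.0).reshape(-1, 1)
y = 1 + 2 * X[:, 0]
model = OLS().fit(X, y)
```

In practice one would reach for `numpy.linalg.lstsq` or `sklearn.linear_model.LinearRegression` instead of inverting X'X, but the closed form above is the textbook definition the pseudocode sections rely on.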