
Nonlinear Logistic Regression

On the algorithms front, this week I watched Andrew Ng's Stanford open course. I highly recommend it: each lecture is around 10 minutes, the reasoning is laid out clearly, the content is rich, and the programming assignments are well worth doing. (I do want to complain here about the training videos from the domestic outfit 七月演算法; I simply couldn't get through them.)

This is what the lectures look like:

And this is what submitting a programming assignment looks like:

Nice work. Getting 100 points every time is quite satisfying. Each week's programming assignment also comes with a very detailed PDF handout that explains it.

The PDF handout looks like this:

The lectures are also pretty entertaining, which makes it easy to keep watching. And the key point: they're free.

I have downloaded every episode as well; once everything is organized I'll upload it to a cloud drive. Leave a comment if you want it.

The tooling for this course is Octave/MATLAB, and MATLAB really is the more comfortable of the two to work with.

The figures below show logistic regression handling a nonlinear decision boundary. Because the boundary is nonlinear, the two original features have to be combined into many polynomial terms to enlarge the feature set; with that many features the model overfits easily, so regularization is needed. The figures correspond to regularization coefficients of 0, 1, and 100, where 0 amounts to no regularization (a sketch of the feature mapping follows the list below).
(Figure: scatter plot of the training data)

lambda = 0: clearly overfitting
lambda = 1: a good fit
lambda = 100: clearly underfitting
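
As mentioned above, the two raw features are expanded into polynomial terms. The exercise provides this as mapFeature.m, which the script below calls but which isn't pasted here; the following is a minimal sketch of such a degree-6 mapping, reconstructed from the exercise's description rather than copied from the official file:

function out = mapFeature(X1, X2)
% MAPFEATURE Maps two input features to all polynomial terms up to degree 6:
%   1, X1, X2, X1.^2, X1.*X2, X2.^2, X1.^3, ..., X2.^6 (28 columns total).
%   The leading column of ones handles the intercept term.
degree = 6;
out = ones(size(X1(:,1)));            % intercept column
for i = 1:degree
    for j = 0:i
        out(:, end+1) = (X1.^(i-j)) .* (X2.^j);
    end
end
end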

The full MATLAB code is as follows:
ex2_reg.m

%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the second part
%  of the exercise which covers regularization with logistic regression.
%
%  You will need to complete the following functions in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Load Data
%  The first two columns contain the X values and the third column
%  contains the label (y).
data = load('ex2data2.txt');
X = data(:, [1, 2]); y = data(:, 3);

plotData(X, y);

% Put some labels
hold on;

% Labels and Legend
xlabel('Microchip Test 1')
ylabel('Microchip Test 2')

% Specified in plot order
legend('y = 1', 'y = 0')
hold off;

%% =========== Part 1: Regularized Logistic Regression ============
%  In this part, you are given a dataset with data points that are not
%  linearly separable. However, you would still like to use logistic
%  regression to classify the data points.
%
%  To do so, you introduce more features to use -- in particular, you add
%  polynomial features to our data matrix (similar to polynomial
%  regression).
%

% Add Polynomial Features
% Note that mapFeature also adds a column of ones for us, so the intercept
% term is handled
X = mapFeature(X(:,1), X(:,2));

% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);

% Set regularization parameter lambda to 1
lambda = 1;

% Compute and display initial cost and gradient for regularized logistic
% regression
[cost, grad] = costFunctionReg(initial_theta, X, y, lambda);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Expected cost (approx): 0.693\n');
fprintf('Gradient at initial theta (zeros) - first five values only:\n');
fprintf(' %f \n', grad(1:5));
fprintf('Expected gradients (approx) - first five values only:\n');
fprintf(' 0.0085\n 0.0188\n 0.0001\n 0.0503\n 0.0115\n');

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

% Compute and display cost and gradient
% with all-ones theta and lambda = 10
test_theta = ones(size(X,2),1);
[cost, grad] = costFunctionReg(test_theta, X, y, 10);

fprintf('\nCost at test theta (with lambda = 10): %f\n', cost);
fprintf('Expected cost (approx): 3.16\n');
fprintf('Gradient at test theta - first five values only:\n');
fprintf(' %f \n', grad(1:5));
fprintf('Expected gradients (approx) - first five values only:\n');
fprintf(' 0.3460\n 0.1614\n 0.1948\n 0.2269\n 0.0922\n');

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 2: Regularization and Accuracies =============
%  Optional Exercise:
%  In this part, you will get to try different values of lambda and
%  see how regularization affects the decision boundary
%
%  Try the following values of lambda (0, 1, 10, 100).
%
%  How does the decision boundary change when you vary lambda? How does
%  the training set accuracy vary?
%

% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);

% Set regularization parameter lambda to 1 (you should vary this)
lambda = 1;

% Set Options
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Optimize
[theta, J, exit_flag] = ...
    fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);

% Plot Boundary
plotDecisionBoundary(theta, X, y);
hold on;
title(sprintf('lambda = %g', lambda))

% Labels and Legend
xlabel('Microchip Test 1')
ylabel('Microchip Test 2')

legend('y = 1', 'y = 0', 'Decision boundary')
hold off;

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);
fprintf('Expected accuracy (with lambda = 1): 83.1 (approx)\n');
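
The lambda = 0, 1, 100 figures above can be reproduced by rerunning Part 2 with different values. One small sketch of how that might look as a loop (this loop is my own addition, not part of the original exercise files):

% Sketch: refit and plot the decision boundary for each regularization
% strength used in the figures above.
for lambda = [0 1 100]
    initial_theta = zeros(size(X, 2), 1);
    options = optimset('GradObj', 'on', 'MaxIter', 400);
    theta = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), ...
                    initial_theta, options);
    figure;
    plotDecisionBoundary(theta, X, y);
    title(sprintf('lambda = %g', lambda));
end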

I won't bother pasting the individual function implementations here.
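
For completeness, here is a minimal sketch of what the completed helper functions could look like (each would live in its own .m file). These vectorized forms are one common way to solve the exercise, not necessarily the author's own solutions; note that the intercept term theta(1) is excluded from the regularization penalty:

function g = sigmoid(z)
% SIGMOID Computes the logistic function elementwise.
g = 1 ./ (1 + exp(-z));
end

function [J, grad] = costFunctionReg(theta, X, y, lambda)
% COSTFUNCTIONREG Regularized logistic regression cost and gradient.
%   The intercept term theta(1) is not penalized.
m = length(y);
h = sigmoid(X * theta);                      % predicted probabilities
J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
    + (lambda / (2*m)) * sum(theta(2:end).^2);
grad = (1/m) * (X' * (h - y));
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end

function p = predict(theta, X)
% PREDICT Predicts the label (0 or 1) using a 0.5 threshold.
p = sigmoid(X * theta) >= 0.5;
end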