[Andrew Ng Machine Learning Notes] Week 4 ex3 Answers
阿新 · Published: 2018-12-11
Same as last week's assignment: just follow the formulas and code them up.
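For reference, the formulas implemented in lrCostFunction below are the standard regularized logistic regression cost and gradient from the course, with h = sigmoid(X*theta) and theta_0 left out of the regularization term:

J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\Big[-y^{(i)}\log h_\theta(x^{(i)}) - (1-y^{(i)})\log\big(1-h_\theta(x^{(i)})\big)\Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

\frac{\partial J}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)})-y^{(i)}\big)x_0^{(i)}

\frac{\partial J}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)})-y^{(i)}\big)x_j^{(i)} + \frac{\lambda}{m}\theta_j \quad (j \ge 1)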
function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
%       efficiently vectorized. For example, consider the computation
%
%           sigmoid(X * theta)
%
%       Each row of the resulting matrix will contain the value of the
%       prediction for that example. You can make use of this to vectorize
%       the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
%       there're many possible vectorized solutions, but one solution
%       looks like:
%           grad = (unregularized gradient for logistic regression)
%           temp = theta;
%           temp(1) = 0;   % because we don't add anything for j = 0
%           grad = grad + YOUR_CODE_HERE (using the temp variable)
%

h = sigmoid(X*theta);
J = 1.0/m*(-y'*log(h)-(1-y)'*log(1-h)) + lambda/(2*m)*theta(2:length(theta))'*theta(2:length(theta));
grad(1) = 1.0/m*X(:,1)'*(h-y);
[~, n_X] = size(X);
grad(2:length(grad)) = 1.0/m*X(:,2:n_X)'*(h-y) + lambda/m*theta(2:length(theta));

% =============================================================

grad = grad(:);

end
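A quick sanity-check sketch before handing the function to fmincg. The toy values mirror the test case in newer copies of the course's ex3.m script; if your copy differs, treat the expected number as approximate rather than authoritative:

% Toy check of lrCostFunction (values assumed from ex3.m's built-in test)
theta_t = [-2; -1; 1; 2];
X_t = [ones(5,1) reshape(1:15,5,3)/10];
y_t = ([1;0;1;0;1] >= 0.5);
[J, grad] = lrCostFunction(theta_t, X_t, y_t, 3);
fprintf('Cost: %f (expected about 2.534819)\n', J);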
I got stuck for a moment on the multi-class part and didn't understand the y == c line at first. In fact, a multi-class classifier is built from multiple binary classifiers: before, we trained only a single theta of (N+1) rows and 1 column, whereas with K classes we train K classifiers, so we end up with a theta of K rows and (N+1) columns.
Note: whether (N+1) is the row or the column dimension has little effect on the result; the key is that (N+1) corresponds to the features and K to the number of label classes.
So compared with the single-class theta, the multi-class theta merely adds probability predictions for the other classes, which is why it is stored as a matrix.
When actually predicting with the multi-class theta, simply multiply theta with X (you can also view this as each individual logistic regression's theta being multiplied with X separately) to get each label's probability, then pick the label with the highest probability as the final prediction; see the shape sketch below.
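For concreteness, with the ex3 handwritten-digit data (m = 5000 examples, N = 400 pixel features, K = 10 labels) the shapes work out as follows; this is an illustrative sketch, not part of the submitted code:

% all_theta        : 10 x 401    (one row of parameters per class, incl. bias)
% X (with bias)    : 5000 x 401
% X * all_theta'   : 5000 x 10   (one column of class scores per label)
% [~, p] = max(X * all_theta', [], 2);   % row-wise argmax = predicted label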
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds
%   to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell you
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%
%     % Set options for fminunc
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
%
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost
%     [theta] = ...
%         fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                 initial_theta, options);
%

all_theta = zeros(num_labels, n + 1);
for c = 1:num_labels
    initial_theta = zeros(n + 1, 1);
    options = optimset('GradObj', 'on', 'MaxIter', 50);
    [theta] = ...
        fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                initial_theta, options);
    all_theta(c,:) = theta';
end

% =========================================================================

end
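A minimal usage sketch, assuming the settings from the course's ex3.m (num_labels = 10 and lambda = 0.1; adjust if your script uses different values):

num_labels = 10;   % digits 1..10 (the digit 0 is stored as label 10 in the data)
lambda = 0.1;
all_theta = oneVsAll(X, y, num_labels, lambda);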
function p = predictOneVsAll(all_theta, X)
%PREDICT Predict the label for a trained one-vs-all classifier. The labels
%are in the range 1..K, where K = size(all_theta, 1).
% p = PREDICTONEVSALL(all_theta, X) will return a vector of predictions
% for each example in the matrix X. Note that X contains the examples in
% rows. all_theta is a matrix where the i-th row is a trained logistic
% regression theta vector for the i-th class. You should set p to a vector
% of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
% for 4 examples)
m = size(X, 1);
num_labels = size(all_theta, 1);
% You need to return the following variables correctly
p = zeros(size(X, 1), 1);
% Add ones to the X data matrix
X = [ones(m, 1) X];
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
% your learned logistic regression parameters (one-vs-all).
% You should set p to a vector of predictions (from 1 to
% num_labels).
%
% Hint: This code can be done all vectorized using the max function.
% In particular, the max function can also return the index of the
% max element, for more information see 'help max'. If your examples
% are in rows, then, you can use max(A, [], 2) to obtain the max
% for each row.
%
p_pro = X*all_theta';        % m x num_labels matrix of class scores
[~, p] = max(p_pro, [], 2);  % index of the largest score = predicted label
% =========================================================================
end
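Usage sketch in the style of ex3.m: run the one-vs-all predictor on the training set and report accuracy. With the settings above, the course script expects accuracy in the mid-90s (roughly 94.9%); treat that figure as approximate.

pred = predictOneVsAll(all_theta, X);
fprintf('Training Set Accuracy: %f\n', mean(double(pred == y)) * 100);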
function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
% p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
% trained weights of a neural network (Theta1, Theta2)
% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);
% You need to return the following variables correctly
p = zeros(size(X, 1), 1);
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
% your learned neural network. You should set p to a
% vector containing labels between 1 to num_labels.
%
% Hint: The max function might come in useful. In particular, the max
% function can also return the index of the max element, for more
% information see 'help max'. If your examples are in rows, then, you
% can use max(A, [], 2) to obtain the max for each row.
%
X = [ones(m,1) X];            % add bias unit to the input layer (a_1)
z_2 = X * Theta1';            % hidden layer pre-activation
a_2 = sigmoid(z_2);           % hidden layer activation
m_2 = size(a_2, 1);
a_2 = [ones(m_2,1) a_2];      % add bias unit to the hidden layer
z_3 = a_2 * Theta2';          % output layer pre-activation
a_3 = sigmoid(z_3);           % output layer activation, one score per class
[~, p] = max(a_3, [], 2);     % predicted label = index of the largest output
% =========================================================================
end
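A minimal usage sketch for the neural-network part, assuming the pre-trained weights shipped with the exercise in ex3weights.mat (which provides Theta1 and Theta2):

load('ex3weights.mat');   % loads Theta1 (25 x 401) and Theta2 (10 x 26)
pred = predict(Theta1, Theta2, X);
fprintf('Training Set Accuracy: %f\n', mean(double(pred == y)) * 100);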