Starting this week the difficulty ramps up noticeably: you need some basic concepts and formulas from probability theory. Fortunately, others have already worked through the details. The following resources were very helpful for this week's material:
1. A Summary of Logistic Regression
2. Probability Theory 03: Conditional Probability
3. Maximum Likelihood Estimation
4. Stanford Machine Learning, Lecture 3: Logistic Regression and Overfitting (Logistic Regression & Regularization)
With these, and especially the write-up behind the fourth link, it is hard not to understand this material.
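As a quick reference before the code: the cost function the exercise asks for follows from maximum likelihood estimation applied to the sigmoid hypothesis h_theta(x) = sigmoid(theta' * x). In the course's notation:

J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ y^{(i)}\log h_\theta(x^{(i)}) + \left(1-y^{(i)}\right)\log\left(1-h_\theta(x^{(i)})\right) \right]

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

The vectorized MATLAB code below computes exactly these two quantities.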
========================== Partial code below ==========================
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
% J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
% parameter for logistic regression and the gradient of the cost
% w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%
h = sigmoid(X * theta);                          % hypothesis h_theta(x) for all m examples, m x 1
J = (-y' * log(h) - (1 - y)' * log(1 - h)) / m;  % cross-entropy cost
grad = (X' * (h - y)) / m;                       % n x 1 gradient, same dimensions as theta
% =============================================================
end
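For context, here is a minimal sketch of how this function is typically driven in the exercise; the surrounding scaffolding (the design matrix X with a column of ones, the label vector y, and the name initial_theta) is assumed rather than shown in this post:

% Hypothetical driver: minimize costFunction with fminunc.
% 'GradObj', 'on' tells fminunc to use the gradient returned in grad.
options = optimset('GradObj', 'on', 'MaxIter', 400);
initial_theta = zeros(size(X, 2), 1);  % one parameter per feature column
[theta, cost] = fminunc(@(t) costFunction(t, X, y), initial_theta, options);

Because the gradient is supplied analytically, fminunc does not have to approximate it numerically, which is both faster and more accurate.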
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
% J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
theta0 = [0; theta(2:end)];                      % copy of theta with the bias entry zeroed: theta(1) is not regularized
h = sigmoid(X * theta);                          % hypothesis h_theta(x) for all m examples, m x 1
J = (-y' * log(h) - (1 - y)' * log(1 - h)) / m ...
    + lambda / (2 * m) * sum(theta0 .^ 2);       % cross-entropy cost plus L2 penalty
grad = (X' * (h - y)) / m + (lambda / m) * theta0;  % n x 1 gradient, same dimensions as theta
% =============================================================
end
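The theta0 trick above (zeroing the first entry before adding the penalty) encodes the convention that the intercept is not regularized. Written out, the quantities the code computes are:

J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ y^{(i)}\log h_\theta(x^{(i)}) + \left(1-y^{(i)}\right)\log\left(1-h_\theta(x^{(i)})\right) \right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m}\theta_j \quad (j \ge 1)

with the \frac{\lambda}{m}\theta_j term omitted for j = 0 (the bias).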