
Computing the cost J(θ) ---- Andrew Ng Machine Learning Exercise 1, Question 2: my code

Table of Contents

  • 1. Question
  • 2. my code

1. Question

As you perform gradient descent to minimize the cost function J(θ), it is helpful to monitor the convergence by computing the cost. In this section, you will implement a function to calculate J(θ) so you can check the convergence of your gradient descent implementation. Your next task is to complete the code in the file computeCost.m, which is a function that computes J(θ). As you are doing this, remember that the variables X and y are not scalar values, but matrices whose rows represent the examples from the training set.

Once you have completed the function, the next step in ex1.m will run computeCost once using θ initialized to zeros, and you will see the cost printed to the screen.
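For reference, the quantity computeCost.m must return is the squared-error cost from the exercise handout:

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta\big(x^{(i)}\big) - y^{(i)} \right)^2, \qquad h_\theta(x) = \theta^T x,$$

where m is the number of training examples.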

You should expect to see a cost of 32.07.
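As a sketch of that step, ex1.m loads the training data, prepends a column of ones for the intercept term, and calls computeCost with θ = [0; 0]. The file name and loading details below follow the exercise scripts but should be treated as assumptions here:

data = load('ex1data1.txt');    % exercise dataset (file name assumed)
m = size(data, 1);              % number of training examples
X = [ones(m, 1), data(:, 1)];   % prepend a column of ones for the intercept term
y = data(:, 2);
theta = zeros(2, 1);            % initialize theta to zeros
J = computeCost(X, y, theta)    % should print a cost of about 32.07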

2. my code

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.
Y = X * theta;       % hypothesis h_theta(x) for every training example (m x 1)
dis = Y - y;         % per-example error
sqErr = 0;           % accumulator (named sqErr to avoid shadowing the built-in sum)
for i = 1:m
    sqErr = sqErr + dis(i)^2;
end
J = 1/(2*m) * sqErr;





% =========================================================================

end
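As a side note, the loop can be replaced by a fully vectorized expression. This is a minimal equivalent sketch, assuming the same X, y, theta, and m as in the function above:

dis = X * theta - y;         % column vector of per-example errors
J = (dis' * dis) / (2 * m);  % dis' * dis equals the sum of squared errors

Both forms compute the same J; the vectorized one simply lets Octave/MATLAB perform the summation internally.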
