I'm in the second week of Professor Andrew Ng's Machine Learning course on Coursera. We're working on linear regression, and right now I'm coding the cost function.
The code I've written solves the problem correctly when I execute each line one by one at the Octave command line, but it returns 0 when I run it from the computeCost.m file in Octave.
Here is the code I have so far:
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
% J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
% parameter for linear regression to fit the data points in X and y
% Initialize some useful values
%m = length(y) % number of training examples
% You need to return the following variables correctly
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
% You should set J to the cost.
m = size(X, 1);                      % number of training examples
predictions = X * theta;             % hypothesis values for all examples
sqrErrors = (predictions - y) .^ 2;  % squared residuals
J = 1 / (2 * m) * sum(sqrErrors);
% =========================================================================
end
I've set
X = [1 1; 1 2; 1 3], y = [1; 2; 3], and theta = [0; 1].
When I execute the lines above one by one at the command prompt I get J = 0, and J = 2.3333 when theta = [0; 0]. But when I run the same code from the file computeCost.m at the Octave command line I always get J = 0, no matter what value I set for theta. Please help!
I put your code in computeCost.m in a file, and then computeCost(X, y, [0; 1]) returned ans = 0, and computeCost(X, y, [0; 0]) returned ans = 2.3333. Both are correct. Try printing out the intermediate results to debug, and check pwd to make sure Octave is actually running the file you edited.
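For anyone who wants to double-check the arithmetic outside Octave, here is a minimal NumPy sketch of the same cost computation. This is just a translation for sanity-checking; the names compute_cost, X, y, and theta mirror the Octave code but are not part of the course files.

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost for linear regression: J = 1/(2m) * sum((X*theta - y)^2)."""
    m = X.shape[0]                       # number of training examples
    predictions = X @ theta              # hypothesis values for all examples
    sqr_errors = (predictions - y) ** 2  # squared residuals
    return np.sum(sqr_errors) / (2 * m)

# Same toy data as in the question: first column of ones, then x = 1, 2, 3.
X = np.array([[1, 1], [1, 2], [1, 3]], dtype=float)
y = np.array([1, 2, 3], dtype=float)

print(compute_cost(X, y, np.array([0.0, 1.0])))  # 0.0 (perfect fit)
print(compute_cost(X, y, np.array([0.0, 0.0])))  # 2.333... (= 14/6)
```

With theta = [0; 1] the predictions are exactly y, so the cost is 0; with theta = [0; 0] the squared errors are 1, 4, 9, giving 14 / (2 * 3) = 2.3333 — the same values the answer reports, which confirms the function itself is correct.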