
Coursera Machine Learning: Linear Regression Octave Programming Assignment

1. Submitting the assignment with Octave

1.1 Download Octave

1.1.1 Download from Baidu Netdisk

Link: https://pan.baidu.com/s/1of7sWiqovaKBBRFGEihUmQ

Extraction code: 8d9u

1.1.2 Download from the official site

http://ftp.gnu.org/gnu/octave/windows/

1.1.3 Download MATLAB? Never mind, it's too big and I'd rather not...

1.2 Open and run Octave

a. If your download does not include an .exe launcher, you can open octave.vbs instead.

b. If you downloaded it from the official site, there is a desktop shortcut; just run it directly and pick the working directory you want at the top of the window.


1.2.1 Change the working directory

The simplest approach is to copy the machine-learning-ex1 folder downloaded from the course site into the folder where Octave is installed (see above).

Another way is to add the path explicitly:

p = 'C:\Users\li\Desktop\machine-learning-ex1';   % path to the downloaded assignment folder
addpath(p)                                        % add it to Octave's search path
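
If you would rather just change the working directory, a cd() call also works. The path below is only an example based on the desktop location above, so substitute wherever you actually unpacked the package:

cd('C:\Users\li\Desktop\machine-learning-ex1\ex1')   % jump into the ex1 exercise folder
pwd   % confirm the current working directory
ls    % you should see ex1.m, submit.m, warmUpExercise.m, and so on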
           

1.2.2 Other setup

Some of the functions used below are already included in the assignment package downloaded from the course site (machine-learning-ex1), so make sure you download it.


1.2.3 Run the code

For reference code, see

https://blog.csdn.net/goddywu/article/details/100220646

or

https://www.jianshu.com/p/9066919072d4

Bug ⚠️: Octave prints the warning "findstr is obsolete; use strfind instead" while running the scripts.
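
This is just a deprecation warning from a findstr call in the scripts and is harmless. If you ever make the swap yourself, note that strfind expects the string to search first and the pattern second (an illustrative snippet, not from the course files):

idx = strfind('machine-learning-ex1', 'ex1')   % returns 18, the index where the match starts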

The real problem: the pause() function does not respond to key presses (details: https://www.mobibrw.com/2019/18501). For now, the only way to run ex1.m is to comment out each pause; statement one by one.
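
A minimal sketch of that workaround; the swap to input(..., 's') is my own substitution rather than anything from the official scripts:

% ex1.m pauses between sections with lines roughly like these:
%   fprintf('Program paused. Press enter to continue.\n');
%   pause;
% If pause never sees the key press, either comment those pause; lines out
% or replace them with input(), which returns as soon as you press Enter:
input('Program paused. Press Enter to continue.', 's');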

Copy the reference code from those write-ups, paste it into the corresponding files, and type submit().

Their sample code:

warmUpExercise.m

function A = warmUpExercise()
%WARMUPEXERCISE Example function in octave
%   A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix 
%               In octave, we return values by defining which variables
%               represent the return values (at the top of the file)
%               and then set them accordingly. 
A = eye(5);
% ===========================================
end
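
A quick way to check it before submitting; the assert line is my own addition, not part of the assignment:

A = warmUpExercise();         % should return the 5x5 identity matrix
assert(isequal(A, eye(5)))    % passes silently when the function is correct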
           

computeCost.m

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

h_theta = X*theta;
J = 1/2/m * sum((h_theta-y).^2);

% =========================================================================

end
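
This implements J = 1/(2m) * sum((X*theta - y).^2), the squared-error cost. A tiny sanity check with made-up numbers (not course data): a hypothesis that fits the points exactly should give a cost of 0.

X = [1 1; 1 2; 1 3];            % m = 3 examples; first column is the intercept term
y = [1; 2; 3];
theta = [0; 1];                 % h(x) = x fits y exactly
J = computeCost(X, y, theta)    % expected: J = 0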
           

gradientDescent.m

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by 
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta. 
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    theta = theta - alpha*(1/m)*X'*(X*theta-y);

    % ============================================================

    % Save the cost J in every iteration    
    J_history(iter) = computeCost(X, y, theta);

end
end
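
For reference, ex1.m drives this function roughly as follows; alpha = 0.01 and 1500 iterations are, as far as I remember, the values the course script uses, so treat this as a sketch rather than the exact script:

data = load('ex1data1.txt');               % data file shipped with the assignment
X = [ones(size(data, 1), 1), data(:, 1)];  % add the intercept column
y = data(:, 2);
theta = zeros(2, 1);                       % start from all-zero parameters
alpha = 0.01;                              % learning rate
num_iters = 1500;                          % number of gradient steps
[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
plot(1:num_iters, J_history);              % the cost should decrease and flatten out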
           

1.2.4 Submit the assignment

submit()

warning: addpath: ./lib: No such file or directory

warning: called from

submit at line 2 column 3

warning: addpath: ./lib/jsonlab: No such file or directory

warning: called from

submitWithConfiguration at line 2 column 3

submit at line 45 column 3

== Submitting solutions | Linear Regression with Multiple Variables…

Use token from last successful submission ([email protected])? (Y/n): Y

==

== Part Name | Score | Feedback

== --------- | ----- | --------

== Warm-up Exercise | 10 / 10 | Nice work!

== Computing Cost (for One Variable) | 40 / 40 | Nice work!

== Gradient Descent (for One Variable) | 50 / 50 | Nice work!

== Feature Normalization | 0 / 0 |

== Computing Cost (for Multiple Variables) | 0 / 0 |

== Gradient Descent (for Multiple Variables) | 0 / 0 |

== Normal Equations | 0 / 0 |

== --------------------------------

== | 100 / 100 |

==

Note: you need to run the exercise files once first so that Octave has loaded the functions; only after that will typing submit() actually respond when you submit.
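
In other words, from inside the ex1 folder the session boils down to something like this:

ex1         % run the exercise script once so Octave loads your functions
submit()    % then submit; you can reuse the token from the last successful submission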

Happy studying!
