## ex5.m

You will implement regularized linear regression and use it to study models with different bias-variance properties.

  • Plot Data (in ex5data1.mat)

ex5_plotting_data.png

  • Compute Regularized Linear Regression Cost

lambda: 1, theta: [1 ; 1]
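
A minimal sketch of that cost computation, assuming X already contains the bias column of ones and theta is a column vector (in the course skeleton both the cost and the gradient live in linearRegCostFunction.m; the name linearRegCostSketch below is just for illustration):

```matlab
function J = linearRegCostSketch(X, y, theta, lambda)
  % Regularized linear regression cost:
  %   J = 1/(2m) * sum((X*theta - y).^2) + lambda/(2m) * sum(theta(2:end).^2)
  m = length(y);                                       % number of training examples
  h = X * theta;                                       % hypothesis values
  reg = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % bias term theta(1) is not regularized
  J = (1 / (2 * m)) * sum((h - y) .^ 2) + reg;
end
```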

  • Compute Regularized Linear Regression Gradient

lambda: 1, theta: [1 ; 1]
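
The matching gradient, sketched under the same assumptions; only theta(2:end) picks up the regularization term, since the bias parameter is never regularized:

```matlab
function grad = linearRegGradientSketch(X, y, theta, lambda)
  % Regularized linear regression gradient:
  %   grad_0 = 1/m * sum((h - y) .* x_0)
  %   grad_j = 1/m * sum((h - y) .* x_j) + lambda/m * theta_j   (for j >= 1)
  m = length(y);
  h = X * theta;
  grad = (1 / m) * (X' * (h - y));                          % unregularized gradient
  grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);  % regularize all but the bias
end
```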

  • Train linear regression and plot fit over the data

lambda: 0

ex5_trained_linear_regression.png
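
A sketch of the training step. The assignment's trainLinearReg.m wraps the course-provided fmincg optimizer around linearRegCostFunction; the stand-in below uses stock fminunc instead so it runs with plain Octave/MATLAB:

```matlab
function theta = trainLinRegSketch(X, y, lambda)
  initial_theta = zeros(size(X, 2), 1);     % start from all-zero parameters
  % linearRegCostFunction is assumed to return [J, grad], as in the course skeleton
  costFunction = @(t) linearRegCostFunction(X, y, t, lambda);
  options = optimset('GradObj', 'on', 'MaxIter', 200);
  theta = fminunc(costFunction, initial_theta, options);
end
```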

  • Compute training error and cross-validation error for linear regression

lambda: 0

training error: evaluate the training error on the first i training examples (i.e., X(1:i, :) and y(1:i))

cross-validation error: evaluate on the entire cross validation set (Xval and yval).
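
A sketch of that loop, assuming the course-provided trainLinearReg and the cost function from the earlier step; note that both errors are evaluated with lambda set to 0, so the regularization term does not inflate them:

```matlab
function [error_train, error_val] = learningCurveSketch(X, y, Xval, yval, lambda)
  m = size(X, 1);
  error_train = zeros(m, 1);
  error_val   = zeros(m, 1);
  for i = 1:m
    theta = trainLinearReg(X(1:i, :), y(1:i), lambda);   % fit on the first i examples
    error_train(i) = linearRegCostFunction(X(1:i, :), y(1:i), theta, 0);  % error without regularization
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);         % CV error on the full set
  end
end
```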

  • Plot learning curve for linear regression

Since the model underfits the data, we expect the learning curve to show the typical “high bias” pattern.

ex5_learning_curve_for_linear_regression.png
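
The figure above comes from plotting the two error vectors against the number of training examples, roughly:

```matlab
m = size(X, 1);
plot(1:m, error_train, 1:m, error_val);     % train and CV error on the same axes
title('Learning curve for linear regression');
legend('Train', 'Cross Validation');
xlabel('Number of training examples');
ylabel('Error');
```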

  • Map X onto Polynomial Features and Normalize

X_poly(i, :) = [X(i) X(i).^2 X(i).^3 … X(i).^p]
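
A sketch of the mapping together with the normalization step; the exercise uses p = 8, and because the higher powers span very different ranges, each column is scaled to zero mean and unit standard deviation before training:

```matlab
p = 8;                                % polynomial degree used in the exercise
X_poly = zeros(numel(X), p);
for j = 1:p
  X_poly(:, j) = X(:) .^ j;           % column j holds the j-th power of the feature
end

mu = mean(X_poly);                    % column-wise mean and standard deviation
sigma = std(X_poly);
X_poly = (X_poly - mu) ./ sigma;      % recent Octave/MATLAB broadcast; older versions need bsxfun
X_poly = [ones(size(X_poly, 1), 1), X_poly];   % prepend the bias column before training
```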

  • Train polynomial regression and plot fit over the data

ex5_trained_polynomial_regression.png

  • Compute training error and cross-validation error for polynomial regression

lambda: 0

training error: evaluate the training error on the first i training examples (i.e., X(1:i, :) and y(1:i))

cross-validation error: evaluate on the entire cross validation set (Xval and yval).

  • Plot learning curve for polynomial regression

Since the model overfits the data, we expect the learning curve to show the typical “high variance” pattern.

ex5_learning_curve_for_polynomial_regression.png

  • Test various values of lambda and compute error
  • Plot validation curve

Use the validation curve to select the “best” lambda value.

The best value of lambda is around 3.

ex5_validation_curve_for_polynomial_regression.png
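
A sketch of the sweep behind that curve, reusing trainLinearReg and linearRegCostFunction and assuming X_poly / X_poly_val are the normalized polynomial feature matrices from the previous steps; the candidate values below follow the list the exercise tries:

```matlab
lambda_vec  = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10]';
error_train = zeros(length(lambda_vec), 1);
error_val   = zeros(length(lambda_vec), 1);

for i = 1:length(lambda_vec)
  lambda = lambda_vec(i);
  theta = trainLinearReg(X_poly, y, lambda);                          % train with this lambda
  error_train(i) = linearRegCostFunction(X_poly, y, theta, 0);        % errors are
  error_val(i)   = linearRegCostFunction(X_poly_val, yval, theta, 0); % measured with lambda = 0
end

plot(lambda_vec, error_train, lambda_vec, error_val);
legend('Train', 'Cross Validation');
xlabel('lambda');
ylabel('Error');
```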
