SVMs
Cost Function

What the SVM is really doing is minimizing the squared norm of the parameter vector θ, i.e. its squared length.
Under this view, the inner product θ'x^(i) can be rewritten as p^(i)·∥θ∥,
where p^(i) denotes the projection of the i-th training example x^(i) onto the parameter vector θ.
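Concretely, the large-margin objective takes the following constrained form (a sketch in the course's notation, using p^(i) and ∥θ∥ as defined above):

\[
\min_{\theta}\ \frac{1}{2}\sum_{j=1}^{n}\theta_j^{2}
\quad\text{s.t.}\quad
\begin{cases}
p^{(i)}\,\lVert\theta\rVert \ge 1 & \text{if } y^{(i)} = 1,\\
p^{(i)}\,\lVert\theta\rVert \le -1 & \text{if } y^{(i)} = 0.
\end{cases}
\]

Since the constraints fix p^(i)·∥θ∥ to be at least 1 in magnitude, shrinking ∥θ∥ forces the projections p^(i) to grow, which is exactly the large-margin effect.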
Kernel Function

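The Gaussian (RBF) kernel measures similarity as K(x1, x2) = exp(−∥x1 − x2∥² / (2σ²)). A minimal sketch matching the gaussianKernel(x1, x2, sigma) signature used by the grid-search code below:

function sim = gaussianKernel(x1, x2, sigma)
% Similarity between x1 and x2 under a Gaussian kernel with bandwidth sigma.
x1 = x1(:); x2 = x2(:);                            % force column vectors
sim = exp(-sum((x1 - x2) .^ 2) / (2 * sigma ^ 2));
end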
Visualizing the Dataset

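A minimal plotting sketch, assuming X is an m-by-2 matrix and y is a 0/1 label vector (the exercise ships an equivalent plotData helper):

% Plot positive examples as black '+' and negative examples as yellow 'o'.
pos = find(y == 1); neg = find(y == 0);
plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 1.5, 'MarkerSize', 7);
hold on;
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);
hold off;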
Changing the Penalty Parameter C
[Figure: decision boundary with C = 1]

[Figure: decision boundary with C = 1000]
When C is not extremely large, the SVM can ignore a few outliers and produce a better decision boundary.
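To reproduce the two boundaries above (a sketch assuming the exercise's svmTrain, linearKernel, and visualizeBoundaryLinear helpers):

% With C = 1 the boundary tolerates the outlier; with C = 1000 it does not.
model = svmTrain(X, y, 1, @linearKernel);
visualizeBoundaryLinear(X, y, model);
model = svmTrain(X, y, 1000, @linearKernel);
visualizeBoundaryLinear(X, y, model);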
SVM with a Gaussian Kernel



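Training on the nonlinear dataset then looks like this (a sketch; the C and sigma values are illustrative, and visualizeBoundary is assumed to be the exercise's nonlinear-boundary plotting helper):

% Train with the Gaussian kernel and draw the resulting nonlinear boundary.
C = 1; sigma = 0.1;
model = svmTrain(X, y, C, @(x1, x2) gaussianKernel(x1, x2, sigma));
visualizeBoundary(X, y, model);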
Choosing Appropriate Values of C and sigma
function [C, sigma] = dataset3Params(X, y, Xval, yval)
%DATASET3PARAMS returns your choice of C and sigma for Part 3 of the exercise
%where you select the optimal (C, sigma) learning parameters to use for SVM
%with RBF kernel
%   [C, sigma] = DATASET3PARAMS(X, y, Xval, yval) returns your choice of C and
%   sigma. You should complete this function to return the optimal C and
%   sigma based on a cross-validation set.
%

% You need to return the following variables correctly.
C = 1;
sigma = 0.3;

% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return the optimal C and sigma
%               learning parameters found using the cross validation set.
%               You can use svmPredict to predict the labels on the cross
%               validation set. For example,
%                   predictions = svmPredict(model, Xval);
%               will return the predictions on the cross validation set.
%
%  Note: You can compute the prediction error using
%        mean(double(predictions ~= yval))
%

% Candidate values for C and sigma (the grid suggested by the exercise).
para_c = [0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30];
para_s = para_c;

best_error = Inf;   % lowest cross-validation error seen so far

for i = 1:length(para_c)
    for j = 1:length(para_s)
        % Train on the training set with this (C, sigma) pair ...
        model = svmTrain(X, y, para_c(i), ...
                         @(x1, x2) gaussianKernel(x1, x2, para_s(j)));
        % ... then measure the misclassification rate on the validation set.
        predictions = svmPredict(model, Xval);
        cur_error = mean(double(predictions ~= yval));
        if cur_error < best_error
            best_error = cur_error;
            C = para_c(i);
            sigma = para_s(j);
        end
    end
end

% =========================================================================

end
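This grid search trains 8 × 8 = 64 models, one per (C, sigma) pair, and keeps whichever pair gives the lowest misclassification rate on the cross-validation set. Initializing best_error to Inf (rather than an arbitrary constant) guarantees the first pair evaluated is always accepted as the initial best.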

