Least Squares (最小二乘法) for Linear Models
Linear Model
The linear model can be rewritten in vector form; here the output y is a scalar. More generally, the output Y is a K-dimensional vector, in which case β is a (p + 1) × K matrix.
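As a sketch of the standard formulation (assuming inputs X = (X_1, ..., X_p) and coefficients β_j; in the vector form a constant 1 is prepended to X):

f(X) = \beta_0 + \sum_{j=1}^{p} X_j \beta_j
\qquad\Longrightarrow\qquad
f(X) = X^T \beta

With K outputs, β has one column of coefficients per output, so β^T X is the K-vector of predictions.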
Least Squares
Choose the coefficient matrix β so that the residual sum of squares (RSS), the sum of squared distances between the predicted and true values over the training data, is minimized. RSS is a quadratic function of β, so its minimum always exists (although the minimizer need not be unique).
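In the same notation, stacking the N training inputs into an N × (p + 1) matrix X (with a leading column of ones) and the targets into a vector y, the criterion is

RSS(\beta) = \sum_{i=1}^{N} (y_i - x_i^T \beta)^2 = (y - X\beta)^T (y - X\beta)

Setting the gradient to zero gives the normal equations X^T X \beta = X^T y; when X^T X is invertible, the unique solution is

\hat{\beta} = (X^T X)^{-1} X^T y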
Sklearn Example Code
import matplotlib.pyplot as plt
import numpy as np
from sklearn import datasets, linear_model
from sklearn.metrics import mean_squared_error, r2_score
# Load the diabetes dataset
diabetes = datasets.load_diabetes()
# Use only one feature (column 2, the BMI feature) so the fit can be plotted
diabetes_X = diabetes.data[:, np.newaxis, 2]
# Split the data into training/testing sets
diabetes_X_train = diabetes_X[:-20]
diabetes_X_test = diabetes_X[-20:]
# Split the target into training/testing sets
diabetes_y_train = diabetes.target[:-20]
diabetes_y_test = diabetes.target[-20:]
# Create a linear regression object
regr = linear_model.LinearRegression()
# Train the model using the training sets
regr.fit(diabetes_X_train, diabetes_y_train)
# Make predictions using the testing set
diabetes_y_pred = regr.predict(diabetes_X_test)
# The coefficients
print('Coefficients: \n', regr.coef_)
# The mean squared error
mse = mean_squared_error(diabetes_y_test, diabetes_y_pred)
print("Mean squared eror: %.2f" % mse)
# The coefficient of determination (R^2): 1 is perfect prediction
print('Coefficient of determination: %.2f' % r2_score(diabetes_y_test, diabetes_y_pred))
# Plot outputs
plt.scatter(diabetes_X_test, diabetes_y_test, color='black')
plt.plot(diabetes_X_test, diabetes_y_pred, color='blue', linewidth=3)
plt.xticks(())
plt.yticks(())
plt.show()
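As a cross-check of the closed-form solution above, here is a minimal NumPy sketch that reuses diabetes_X_train and diabetes_y_train from the example; the explicit column of ones plays the role of the intercept, and the result should agree with regr.intercept_ and regr.coef_ up to numerical error.

import numpy as np
# Prepend a column of ones so the intercept is estimated as part of beta
X = np.hstack([np.ones((diabetes_X_train.shape[0], 1)), diabetes_X_train])
y = diabetes_y_train
# Solve the normal equations X^T X beta = X^T y
# (np.linalg.solve avoids forming the inverse explicitly)
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print('Intercept: %.2f' % beta_hat[0])    # compare with regr.intercept_
print('Coefficient: %.2f' % beta_hat[1])  # compare with regr.coef_[0]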
Disadvantages
- Least squares relies on a linear model, whose limited expressive power cannot fit the nonlinear relationships found in real-world data;
- If the dimensionality p of the samples X is high, (X^T X)^{-1} is a p × p matrix, and computing its inverse (roughly O(p^3) operations) becomes expensive; see the sketch below.
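To make the cost concrete, here is a small sketch with hypothetical sizes (N = 1000 samples and p = 500 features are assumptions chosen only for illustration). The normal-equations route builds and solves a (p + 1) × (p + 1) system, whereas np.linalg.lstsq solves the same least-squares problem through an SVD without ever forming X^T X, which is the kind of factorization-based approach libraries generally prefer when p is large or X is ill-conditioned.

import numpy as np
# Hypothetical sizes, chosen only to illustrate the point
N, p = 1000, 500
rng = np.random.default_rng(0)
X = np.hstack([np.ones((N, 1)), rng.standard_normal((N, p))])
y = rng.standard_normal(N)
# Normal equations: builds a (p + 1) x (p + 1) matrix, roughly O(N p^2 + p^3) work
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)
# SVD-based least squares: no explicit X^T X or matrix inverse
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_normal, beta_lstsq, atol=1e-6))  # both give the same fit here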