Machine Learning (Andrew Ng) - 4. Linear Regression with Multiple Variables
4.1 Multiple features (多特征量)
Multiple features (variables)
Size (feet²) | Number of bedrooms | Number of floors | Age of home (years) | Price ($1000) |
---|---|---|---|---|
2104 | 5 | 1 | 45 | 460 |
1416 | 3 | 2 | 40 | 232 |
1534 | 3 | 2 | 30 | 315 |
852 | 2 | 1 | 36 | 178 |
… | … | … | … | … |
Notation :
- $n$ = number of features
- $x^{(i)}$ = input (features) of the $i^{th}$ training example
- $x_j^{(i)}$ = value of feature $j$ in the $i^{th}$ training example

For example, from the table above: $x^{(2)} = [1416, 3, 2, 40]^T$ and $x_3^{(2)} = 2$.
Hypothesis :
previously : $h_\theta(x) = \theta_0 + \theta_1 x$
now : $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n$

With the convention $x_0 = 1$, so that $x = [x_0, x_1, \dots, x_n]^T$ and $\theta = [\theta_0, \theta_1, \dots, \theta_n]^T$, this can be written compactly as $h_\theta(x) = \theta^T x$.
Multivariate linear regression 多元线性回归
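As a minimal sketch of the compact form $h_\theta(x) = \theta^T x$ (assuming NumPy; the $\theta$ values and the example are made up for illustration):

```python
import numpy as np

theta = np.array([80.0, 0.1, 10.0, 3.0, -2.0])  # theta_0 .. theta_4, made-up values
x = np.array([1.0, 2104, 5, 1, 45])             # x_0 = 1, then size, bedrooms, floors, age
h = theta @ x                                   # h_theta(x) = theta^T x
print(h)                                        # predicted price in $1000s
```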
4.2 Gradient descent for multiple variables
How do we fit the parameters of this hypothesis? How do we use gradient descent for linear regression with multiple features?

Cost function:
$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$

Gradient descent: repeat until convergence
$\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$
(simultaneously update $\theta_j$ for $j = 0, 1, \dots, n$)
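A minimal NumPy sketch of this update, vectorized over all $\theta_j$ (the function name and defaults are my own):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for linear regression.
    X: (m, n+1) design matrix whose first column is all ones (x_0 = 1).
    y: (m,) target vector. Returns theta and the per-iteration costs."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    J_history = []
    for _ in range(num_iters):
        error = X @ theta - y                        # h_theta(x^(i)) - y^(i), all i at once
        theta = theta - (alpha / m) * (X.T @ error)  # simultaneous update of every theta_j
        J_history.append((error @ error) / (2 * m))  # J(theta) for this iteration
    return theta, J_history
```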
4.3 Gradient descent in practice I : Feature Scaling (特征缩放)
Feature Scaling : make sure features are on a similar scale, so gradient descent converges faster. Get every feature into approximately a $-1 \le x_i \le 1$ range.
Mean normalization (均值归一化) : replace $x_i$ with $x_i - \mu_i$ to give features approximately zero mean (do not apply to $x_0 = 1$):
$x_i := \frac{x_i - \mu_i}{s_i}$
where $\mu_i$ is the average of feature $i$ in the training set and $s_i$ is the range ($\max - \min$) or the standard deviation.
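A small sketch of mean normalization over a feature matrix (assuming NumPy; it returns $\mu$ and $s$ so new examples can be scaled the same way):

```python
import numpy as np

def feature_normalize(X):
    """Mean normalization, column by column: (x_i - mu_i) / s_i.
    Here s_i is the standard deviation; the range (max - min) works too."""
    mu = X.mean(axis=0)
    s = X.std(axis=0)
    return (X - mu) / s, mu, s
```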
4.4 Gradient descent in practice II : Learning rate
Debugging: plot $J(\theta)$ against the number of iterations; if gradient descent is working correctly, $J(\theta)$ should decrease after every iteration. If $\alpha$ is too small: convergence is slow. If $\alpha$ is too large: $J(\theta)$ may not decrease on every iteration and may not converge. To choose $\alpha$, try a range of values such as $\dots, 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, \dots$
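A sketch of that debugging plot, reusing the `gradient_descent` function from section 4.2 (the synthetic data is made up just so the snippet runs on its own):

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up data: 50 examples, 2 features plus the x_0 = 1 column.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=50)

# Try a few learning rates; with a good alpha the curve of J(theta)
# decreases on every iteration and then flattens out.
for alpha in (0.001, 0.01, 0.1):
    _, J_history = gradient_descent(X, y, alpha=alpha, num_iters=400)
    plt.plot(J_history, label=f"alpha = {alpha}")
plt.xlabel("iteration")
plt.ylabel("J(theta)")
plt.legend()
plt.show()
```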
4.5 Features and polynomial regression (特征和多项式回归)
Choosing features
The price could be a quadratic function (二次函数) of size, $h_\theta(x) = \theta_0 + \theta_1 (size) + \theta_2 (size)^2$, or a cubic function (三次函数), $h_\theta(x) = \theta_0 + \theta_1 (size) + \theta_2 (size)^2 + \theta_3 (size)^3$. By defining new features $x_1 = (size)$, $x_2 = (size)^2$, $x_3 = (size)^3$, this reduces to multivariate linear regression.
Now feature scaling is even more important: if $size \in [1, 1000]$, then $(size)^2 \in [1, 10^6]$ and $(size)^3 \in [1, 10^9]$. See the sketch below.
How to choose features? Algorithms that do this automatically are discussed later…
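A sketch of building polynomial features from a single size column (my own helper; note the scaling step, for the reason above):

```python
import numpy as np

def polynomial_features(size, degree=3):
    """Build columns [size, size^2, ..., size^degree], then mean-normalize
    each column, since the powers live on wildly different scales."""
    X = np.column_stack([size ** d for d in range(1, degree + 1)])
    return (X - X.mean(axis=0)) / X.std(axis=0)
```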
4.6 Normal equation (正规方程)
Normal equation : a method to solve for $\theta$ analytically. Instead of iterating, you get to the optimal value in one step:
$\theta = (X^T X)^{-1} X^T y$
where $X$ is the $m \times (n+1)$ design matrix whose $i^{th}$ row is $(x^{(i)})^T$ (with $x_0^{(i)} = 1$) and $y$ is the $m$-dimensional vector of targets.
Feature scaling is not needed when using the normal equation.
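A one-step sketch of the normal equation (using `pinv`, which also handles a non-invertible $X^T X$, e.g. redundant features or $m \le n$):

```python
import numpy as np

def normal_equation(X, y):
    """theta = (X^T X)^{-1} X^T y, solved analytically in one step.
    X is the (m, n+1) design matrix with a leading column of ones."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y
```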
When should we choose gradient descent, and when the normal equation? With $m$ training examples and $n$ features:
- Gradient descent : need to choose $\alpha$; needs many iterations; works well even when $n$ is large.
- Normal equation : no need to choose $\alpha$; no iterations; needs to compute $(X^T X)^{-1}$, which is roughly $O(n^3)$; slow if $n$ is very large (e.g. $n \ge 10{,}000$).