Multivariate Linear Regression
\[h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n\]
Letting \(x_0 = 1\), the formula above simplifies to:
\[h_\theta(x) = \theta^T x\]
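A minimal NumPy sketch of this vectorized hypothesis, using hypothetical data (the matrix values and parameters below are illustrative, not from the notes): we prepend a column of ones so that \(x_0 = 1\) and the intercept \(\theta_0\) is absorbed into a single dot product.

```python
import numpy as np

# Hypothetical training set: m = 3 examples, n = 2 features.
X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.5]])
m = X.shape[0]

# Prepend the bias column x_0 = 1 to every example.
X_b = np.hstack([np.ones((m, 1)), X])

# Illustrative parameter vector (theta_0, theta_1, theta_2).
theta = np.array([0.5, 1.0, -2.0])

# h_theta(x) = theta^T x, computed for all m examples at once.
h = X_b @ theta
```

Stacking the examples row-wise lets one matrix-vector product evaluate the hypothesis for the whole training set, which is why the bias trick \(x_0 = 1\) is worth the extra column.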
Multivariate Gradient Descent
\[\theta_j := \theta_j - \alpha\frac{\partial}{\partial\theta_j}J(\theta_0,\theta_1,\cdots,\theta_n)\]
Taking the derivative (and updating every \(\theta_j\) simultaneously, for \(j = 0, \dots, n\)), this becomes:
\[\theta_j := \theta_j - \alpha\frac{1}{m}\sum_{i=1}^m\left(h_\theta(x^{(i)})-y^{(i)}\right)x_j^{(i)}\]
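The update rule above can be sketched as one vectorized step in NumPy. The data below (fit to the hypothetical relation \(y = 1 + 2x\)) and the learning rate \(\alpha = 0.1\) are illustrative assumptions, not values from the notes:

```python
import numpy as np

def gradient_descent_step(X_b, y, theta, alpha):
    """One simultaneous update of all theta_j.

    X_b must already contain the bias column x_0 = 1.
    """
    m = X_b.shape[0]
    errors = X_b @ theta - y            # h_theta(x^(i)) - y^(i), for all i
    gradient = (X_b.T @ errors) / m     # (1/m) * sum_i (error_i * x_j^(i)), per j
    return theta - alpha * gradient

# Hypothetical single-feature data generated by y = 1 + 2x.
X_b = np.array([[1.0, 1.0],
                [1.0, 2.0],
                [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])

theta = np.zeros(2)
for _ in range(2000):
    theta = gradient_descent_step(X_b, y, theta, alpha=0.1)
```

Computing `errors` once and multiplying by `X_b.T` performs the sum over all \(m\) examples for every \(j\) at the same time, which is exactly the "simultaneous update" the rule requires; with these data, `theta` converges toward \((1, 2)\).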