Multiple Features
Note: Linear regression with multiple variables is also known as "multivariate linear regression".
We now introduce notation for equations where we can have any number of input variables:

x_j^(i) = value of feature j in the i-th training example
x^(i) = the input (features) of the i-th training example
m = the number of training examples
n = the number of features
The multivariable form of the hypothesis function accommodating these multiple features is as follows:

h_θ(x) = θ_0 + θ_1 x_1 + θ_2 x_2 + ... + θ_n x_n

In order to develop intuition about this function, we can think about θ_0 as the basic price of a house, θ_1 as the price per square meter, θ_2 as the price per floor, etc. x_1 will be the number of square meters in the house, x_2 the number of floors, etc.
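To make the house-price intuition concrete, here is a minimal Python sketch that evaluates the hypothesis for one example; the parameter and feature values are made up purely for illustration.

```python
# Minimal sketch of the multivariable hypothesis for the house-price example.
# The parameter and feature values below are hypothetical.

theta = [80.0, 0.1, 1.5]   # [theta_0 (base price), theta_1 (per square meter), theta_2 (per floor)]
x = [1.0, 120.0, 2.0]      # [x_0 = 1, x_1 = square meters, x_2 = number of floors]

# h_theta(x) = theta_0 * x_0 + theta_1 * x_1 + theta_2 * x_2
h = sum(t * xi for t, xi in zip(theta, x))
print(h)  # 80 + 0.1*120 + 1.5*2 = 95.0
```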
Using the definition of matrix multiplication, our multivariable hypothesis function can be concisely represented as:

h_θ(x) = [θ_0  θ_1  ...  θ_n] [x_0  x_1  ...  x_n]^T = θ^T x
This is a vectorization of our hypothesis function for one training example; see the lessons on vectorization to learn more.
Remark: Note that for convenience reasons in this course we assume x_0^(i) = 1 for i = 1, ..., m. This allows us to do matrix operations with θ and x, making the two vectors θ and x^(i) match each other element-wise (that is, have the same number of elements: n+1).
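As a sketch of this vectorized form, the NumPy snippet below prepends x_0 = 1 to a raw feature vector and computes θ^T x; the numbers are hypothetical and reuse the house-price example above.

```python
# Minimal NumPy sketch of the vectorized hypothesis h_theta(x) = theta^T x.
# All feature and parameter values are hypothetical.
import numpy as np

theta = np.array([80.0, 0.1, 1.5])   # [theta_0, theta_1, theta_2]
x_raw = np.array([120.0, 2.0])       # raw features: [square meters, floors]

x = np.insert(x_raw, 0, 1.0)         # prepend x_0 = 1 so x has n+1 elements, matching theta
h = theta @ x                        # inner product theta^T x
print(h)                             # 95.0

# The same idea applied to a design matrix X (one example per row,
# first column all ones) predicts every training example at once:
X = np.array([[1.0, 120.0, 2.0],
              [1.0,  90.0, 1.0]])
print(X @ theta)                     # [95.  90.5]
```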
Source: Coursera Machine Learning (Stanford, Andrew Ng)