[Machine Learning] - Week 2, 1. Multivariate Linear Regression

Author: Kitty_风花 | Published 2019-11-30 10:48

    Multiple Features

    Note:  

    Linear regression with multiple variables is also known as "multivariate linear regression".

    We now introduce notation for equations where we can have any number of input variables:

    x_j^(i) = value of feature j in the i-th training example
    x^(i) = the input (features) of the i-th training example
    m = the number of training examples
    n = the number of features

    The multivariable form of the hypothesis function accommodating these multiple features is as follows:

    h_θ(x) = θ_0 + θ_1 x_1 + θ_2 x_2 + θ_3 x_3 + ⋯ + θ_n x_n

    In order to develop intuition about this function, we can think about θ_0 as the basic price of a house, θ_1 as the price per square meter, θ_2 as the price per floor, etc. x_1 will be the number of square meters in the house, x_2 the number of floors, etc.
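
    As a concrete illustration of this pricing intuition, here is a minimal sketch in Python; the θ values and the house features below are hypothetical numbers chosen only for this example.

```python
def hypothesis(theta, x):
    """h_theta(x) = theta_0 + theta_1*x_1 + ... + theta_n*x_n."""
    return theta[0] + sum(t * xi for t, xi in zip(theta[1:], x))

theta = [80000.0, 1500.0, 10000.0]  # basic price, price per square meter, price per floor
x = [100.0, 2.0]                    # a 100 m^2 house with 2 floors
print(hypothesis(theta, x))         # 80000 + 1500*100 + 10000*2 = 250000.0
```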

    Using the definition of matrix multiplication, our multivariable hypothesis function can be concisely represented as:

    h_θ(x) = θ^T x, where θ = [θ_0, θ_1, …, θ_n] and x = [x_0, x_1, …, x_n] are treated as (n+1)-dimensional column vectors.

    This is a vectorization of our hypothesis function for one training example; see the lessons on vectorization to learn more.
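
    A minimal sketch of this vectorized form with NumPy, reusing the hypothetical house-price numbers from above:

```python
import numpy as np

# Vectorized hypothesis for one training example: h_theta(x) = theta^T x.
# x includes the extra entry x_0 = 1, so theta and x both have n+1 elements.
theta = np.array([80000.0, 1500.0, 10000.0])  # theta_0, theta_1, theta_2
x = np.array([1.0, 100.0, 2.0])               # x_0 = 1, square meters, floors

h = theta @ x  # inner product theta^T x
print(h)       # 250000.0
```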

    Remark: Note that for convenience reasons in this course we assume x_0^(i) = 1 for (i = 1, …, m).

    This allows us to do matrix operations with θ and x, making the two vectors θ and x^(i) match each other element-wise (that is, have the same number of elements: n+1).
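
    In code this assumption usually amounts to prepending a column of ones to the feature matrix. A minimal sketch, assuming a hypothetical data set of m = 3 houses with n = 2 features:

```python
import numpy as np

X = np.array([[100.0, 2.0],
              [150.0, 3.0],
              [ 80.0, 1.0]])                  # m x n feature matrix (square meters, floors)
X = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend x_0 = 1 -> m x (n+1)

theta = np.array([80000.0, 1500.0, 10000.0])
print(X @ theta)  # hypothesis for all m examples at once: [250000. 335000. 210000.]
```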

    Source: Coursera, Stanford, Andrew Ng, Machine Learning
