
[Machine Learning] - Week 2, 4. Features and Polynomial Regression

By Kitty_风花 | Published 2019-11-30 10:49

Features and Polynomial Regression

We can improve our features and the form of our hypothesis function in a couple different ways.

We can combine multiple features into one. For example, we can combine x_1 and x_2 into a new feature x_3 by taking x_1 ⋅ x_2.
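
As a minimal sketch of this idea (NumPy is assumed here rather than the Octave used in the course, and the frontage/depth numbers are made up for illustration), combining two features into their product could look like this:

```python
import numpy as np

# Hypothetical example: x1 = frontage and x2 = depth of a plot of land.
x1 = np.array([50.0, 60.0, 80.0])    # frontage in feet
x2 = np.array([100.0, 90.0, 120.0])  # depth in feet

# Combine the two features into a single new feature x3 = x1 * x2 (the area).
x3 = x1 * x2

# Design matrix using only the combined feature, plus the intercept column.
X = np.column_stack([np.ones_like(x3), x3])
print(X)
```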

Polynomial Regression

Our hypothesis function need not be linear (a straight line) if that does not fit the data well.

We can change the behavior or curve of our hypothesis function by making it a quadratic, cubic or square root function (or any other form).

For example, if our hypothesis function is

h_θ(x) = θ_0 + θ_1 x_1

then we can create additional features based on x_1 to get the quadratic function

h_θ(x) = θ_0 + θ_1 x_1 + θ_2 x_1^2

or the cubic function

h_θ(x) = θ_0 + θ_1 x_1 + θ_2 x_1^2 + θ_3 x_1^3

In the cubic version, we have created new features x_2 and x_3, where x_2 = x_1^2 and x_3 = x_1^3.

To make it a square root function, we could do:

h_θ(x) = θ_0 + θ_1 x_1 + θ_2 √(x_1)
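
A minimal sketch of these polynomial features in NumPy (the data and the least-squares solve are assumptions for illustration, not the course's gradient-descent code): new columns x_1^2, x_1^3, or √x_1 are built from the single input x_1 and passed to ordinary linear regression.

```python
import numpy as np

# Hypothetical training data: one input feature x1 and targets y.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y  = np.array([1.5, 4.0, 9.5, 17.0, 26.0])

# Create additional features from x1: x2 = x1^2, x3 = x1^3.
x2 = x1 ** 2
x3 = x1 ** 3

# Cubic hypothesis: h(x) = θ0 + θ1*x1 + θ2*x1^2 + θ3*x1^3.
X = np.column_stack([np.ones_like(x1), x1, x2, x3])

# Solve for θ with least squares (equivalent to the normal equation).
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta)

# Square-root variant: h(x) = θ0 + θ1*x1 + θ2*sqrt(x1).
X_sqrt = np.column_stack([np.ones_like(x1), x1, np.sqrt(x1)])
theta_sqrt, *_ = np.linalg.lstsq(X_sqrt, y, rcond=None)
print(theta_sqrt)
```

The point is that the model is still linear in the parameters θ; only the columns of the design matrix change.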

One important thing to keep in mind is that if you choose your features this way, then feature scaling becomes very important: if x_1 has range 1–1000, then the range of x_1^2 becomes 1–1,000,000 and that of x_1^3 becomes 1–1,000,000,000.
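
To see the effect, here is a small sketch (hypothetical numbers, NumPy assumed) of mean normalization applied to the polynomial features, so that x_1, x_1^2, and x_1^3 end up on comparable scales before gradient descent:

```python
import numpy as np

# If x1 ranges over 1..1000, then x1^2 reaches 1e6 and x1^3 reaches 1e9.
x1 = np.linspace(1.0, 1000.0, 50)
features = np.column_stack([x1, x1 ** 2, x1 ** 3])
print(features.max(axis=0))  # vastly different ranges per column

# Mean normalization: subtract the column mean and divide by the column range.
mu = features.mean(axis=0)
span = features.max(axis=0) - features.min(axis=0)
scaled = (features - mu) / span

print(scaled.min(axis=0), scaled.max(axis=0))  # each column now lies roughly within [-1, 1]
```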

Source: Coursera, Stanford University, Andrew Ng, Machine Learning
