[Week 3] Machine-learning Notes

Author: 东皇Amrzs | Published 2016-12-13 17:20

    Week 3 Overview

    Welcome to week 3! This week, we’ll be covering logistic regression. Logistic regression is a method for classifying data into discrete outcomes. For example, we might use logistic regression to classify an email as spam or not spam. In this module, we introduce the notion of classification, the cost function for logistic regression, and the application of logistic regression to multi-class classification.

    We are also covering regularization. Machine learning models need to generalize well to new examples that they have not seen in practice. We’ll introduce regularization, which helps prevent models from overfitting the training data.
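    As a preview (my addition, not part of the original notes), the regularized cost function for logistic regression introduced later this week adds a penalty on the magnitudes of θ₁, …, θ_n (the intercept θ₀ is conventionally left unpenalized):

        $$ J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[ y^{(i)} \log h_\theta(x^{(i)}) + (1 - y^{(i)}) \log\big(1 - h_\theta(x^{(i)})\big) \Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2 $$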

    As always, if you get stuck on the quiz and programming assignment, you should post on the Discussions to ask for help. (And if you finish early, I hope you'll go there to help your fellow classmates as well.)

    Classification

    The first part is entirely about motivating a hypothesis function for classification. It is easy to see that a linear regression model is not suited to fitting a classification problem, so another model is introduced instead: the logistic regression model.
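    More concretely (my gloss, not in the original): in binary classification the label can take only two values, while a linear hypothesis θᵀx can output any real number, so we want a hypothesis whose output is always confined to [0, 1]:

        $$ y \in \{0, 1\}, \qquad 0 \le h_\theta(x) \le 1 $$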

    Hypothesis Representation

    The expression for the logistic regression hypothesis function is given below.
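    (The formula itself appears to have been an image in the original post. The standard form from the course applies the sigmoid, or logistic, function g to the linear combination θᵀx, and the output is read as the estimated probability that y = 1 given x:)

        $$ h_\theta(x) = g(\theta^{T} x), \qquad g(z) = \frac{1}{1 + e^{-z}}, \qquad h_\theta(x) = P(y = 1 \mid x;\ \theta) $$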

    In the next section I want to give you a more intuitive sense of what this hypothesis function really means, which brings in a new concept: the decision boundary.

    Decision Boundary

    We take 0.5 as the threshold for deciding the class. Looking at the graph of g(z), it is easy to see that g(z) >= 0.5 exactly when z >= 0, that is, when θᵀx >= 0.

    So if we already know the values of θ, we can derive the decision boundary from this condition. In the figure, the boundary is the set of points where θᵀx = 0, separating the region of x where the hypothesis predicts y = 1 from the region where it predicts y = 0.
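    As a concrete illustration (my own sketch, not from the original post), suppose a hypothetical parameter vector θ = (-3, 1, 1). Then θᵀx = -3 + x₁ + x₂, so the decision boundary is the line x₁ + x₂ = 3, and any point with x₁ + x₂ >= 3 is classified as y = 1:

```python
import numpy as np

def sigmoid(z):
    # Logistic function g(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X):
    # Hypothesis h_theta(x) = g(theta^T x); predict y = 1 when h >= 0.5,
    # which is equivalent to theta^T x >= 0.
    return (X @ theta >= 0).astype(int)

# Hypothetical parameters chosen for illustration (not fitted from data):
# h_theta(x) = g(-3 + x1 + x2), so the decision boundary is x1 + x2 = 3.
theta = np.array([-3.0, 1.0, 1.0])

# Two example points, each prefixed with 1 for the intercept term x0.
X = np.array([
    [1.0, 1.0, 1.0],   # x1 + x2 = 2 < 3  -> predict y = 0
    [1.0, 2.5, 2.5],   # x1 + x2 = 5 >= 3 -> predict y = 1
])

print(sigmoid(X @ theta))  # estimated probabilities that y = 1
print(predict(theta, X))   # [0 1]
```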

    The θ values in this section were simply assumed; the next section explains how to actually fit these θ values.
