Machine Learning Algorithms: Linear Regression (LinearRegression)

Author: 皮皮大 | Published 2019-09-30 07:08

    The Linear Regression Method

    Core Idea

    • Solves regression problems
    • The algorithm is highly interpretable
    • In a typical plot, the horizontal axis is the feature (attribute) and the vertical axis is the predicted output label (a concrete numeric value)

    In classification problems, by contrast, both axes are sample features (e.g. tumor size and time since the tumor was discovered).

    Problem Setup

    • Fit a straight line y = ax + b
    • For each sample point x^{(i)}, compute the predicted value \hat{y}^{(i)} = a x^{(i)} + b
    • Make the gap between the true values and the predicted values as small as possible, usually measured by the sum of squared differences, giving the loss function: \min \sum_{i=1}^{m} (y^{(i)} - \hat{y}^{(i)})^2, i.e. \min \sum_{i=1}^{m} (y^{(i)} - a x^{(i)} - b)^2
    • Minimizing this loss function amounts to solving for a and b

    Solving for a and b by Least Squares

    The loss function is J(a,b) = \sum_{i=1}^{m} (y^{(i)} - a x^{(i)} - b)^2.
    Take the partial derivatives with respect to a and b, set each to zero, and solve. The final result is:

    a = \frac{\sum_{i=1}^{m} x^{(i)} y^{(i)} - m\bar{x}\bar{y}}{\sum_{i=1}^{m} (x^{(i)})^2 - m\bar{x}^2}, \qquad b = \bar{y} - a\bar{x}

    • Differentiate with respect to b first:

      \frac{\partial J}{\partial b} = -2\sum_{i=1}^{m} (y^{(i)} - a x^{(i)} - b) = 0 \implies b = \bar{y} - a\bar{x}

    • Then differentiate with respect to a (substituting b = \bar{y} - a\bar{x}):

      \frac{\partial J}{\partial a} = -2\sum_{i=1}^{m} (y^{(i)} - a x^{(i)} - b)\, x^{(i)} = 0 \implies a = \frac{\sum_{i=1}^{m} x^{(i)} y^{(i)} - m\bar{x}\bar{y}}{\sum_{i=1}^{m} (x^{(i)})^2 - m\bar{x}^2}

    An equivalent form of a (the one used in the code below):

    a = \frac{\sum_{i=1}^{m} (x^{(i)} - \bar{x})(y^{(i)} - \bar{y})}{\sum_{i=1}^{m} (x^{(i)} - \bar{x})^2}

    Vectorization

    Vectorization targets the expression for a: with w^{(i)} = x^{(i)} - \bar{x} and v^{(i)} = y^{(i)} - \bar{y}, the numerator is the dot product of w and v, and the denominator is the dot product of w with itself:

    a = \frac{\sum_{i=1}^{m} w^{(i)} v^{(i)}}{\sum_{i=1}^{m} w^{(i)} w^{(i)}} = \frac{w \cdot v}{w \cdot w}
    import numpy as np
    
    class SimpleLinearRegression1(object):
        def __init__(self):
            # a and b are not user-supplied parameters; they are attributes learned by fit
            self.a_ = None
            self.b_ = None
        
        def fit(self, x_train,y_train):
            # fit: learn the model from the training data
            assert x_train.ndim == 1, \
                "simple linear regression can only solve single feature training data"
            assert len(x_train) == len(y_train), \
                "the size of x_train must be equal to the size of y_train"
    
            x_mean = np.mean(x_train)
            y_mean = np.mean(y_train)
    
            num = 0.0
            d = 0.0
            for x, y in zip(x_train, y_train):
                num += (x - x_mean) * (y - y_mean)
                d += (x - x_mean) ** 2
            
            self.a_ = num / d
            self.b_ = y_mean - self.a_ * x_mean
            
            # return self, following the sklearn convention for fit
            return self
        
        def predict(self, x_predict):
            # x_predict holds the feature values to predict for
            assert x_predict.ndim == 1, \
                "simple linear regression can only solve single feature training data"
            assert self.a_ is not None and self.b_ is not None, \
                "must fit before predict!"
                
            return np.array([self._predict(x) for x in x_predict])
        
        def _predict(self, x_single):
            # predict a single sample
            return self.a_ * x_single + self.b_
        
        def __repr__(self):
            # string representation
            return "SimpleLinearRegression1()"
        
      
    # Vectorized implementation
    class SimpleLinearRegression2(object):
        def __init__(self):
            # a and b are not user-supplied parameters; they are attributes learned by fit
            self.a_ = None
            self.b_ = None
        
        def fit(self, x_train, y_train):
            # fit: learn the model from the training data
            assert x_train.ndim == 1, \
                "simple linear regression can only solve single feature training data"
            assert len(x_train) == len(y_train), \
                "the size of x_train must be equal to the size of y_train"
    
            x_mean = np.mean(x_train)
            y_mean = np.mean(y_train)
            
            # replace the for loop with vector operations (numpy's .dot),
            # following the vectorization formula above
            num = (x_train - x_mean).dot(y_train - y_mean)
            d = (x_train - x_mean).dot(x_train - x_mean)
            
            self.a_ = num / d
            self.b_ = y_mean - self.a_ * x_mean
            
            # return self, following the sklearn convention for fit
            return self
        
        def predict(self, x_predict):
            # x_predict holds the feature values to predict for
            assert x_predict.ndim == 1, \
                "simple linear regression can only solve single feature training data"
            assert self.a_ is not None and self.b_ is not None, \
                "must fit before predict!"
                
            return np.array([self._predict(x) for x in x_predict])
        
        def _predict(self, x_single):
            # predict a single sample
            return self.a_ * x_single + self.b_
        
        def __repr__(self):
            # string representation for easy inspection
            return "SimpleLinearRegression2()"
    

    Evaluation Metrics

    Evaluation approach: split the data into a training set (train) and a test set (test); fit a and b on the training set, then measure performance on the test set.
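
    A minimal sketch of such a split using sklearn's train_test_split (the toy arrays are made up for illustration):

    import numpy as np
    from sklearn.model_selection import train_test_split

    # toy 1-D feature and label arrays, for illustration only
    x = np.arange(10, dtype=float)
    y = 2.0 * x + 1.0

    # hold out 20% of the samples as the test set
    x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=666)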

    • Mean squared error (MSE); its units are the square of the target's units: MSE=\frac {1}{m}\sum ^{m}_{i=1}(y^{(i)}_{test}-\hat y^{(i)}_{test})^2
    • Root mean squared error (RMSE), which restores the original units: RMSE=\sqrt{MSE_{test}}=\sqrt {\frac {1}{m}\sum ^{m}_{i=1}(y^{(i)}_{test}-\hat y^{(i)}_{test})^2}
    • Mean absolute error (MAE): MAE=\frac {1}{m}\sum^{m}_{i=1}|y^{(i)}_{test}-\hat y^{(i)}_{test}|

    sklearn does not provide RMSE directly; it only provides MAE and MSE.

    import numpy as np
    from math import sqrt
    
    
    def accuracy_score(y_true, y_predict):
        '''Compute the classification accuracy between y_true and y_predict'''
        assert y_true.shape[0] == y_predict.shape[0], \
            "the size of y_true must be equal to the size of y_predict"
    
        return sum(y_true == y_predict) / len(y_true)
    
    
    def mean_squared_error(y_true, y_predict):
        # compute the MSE between y_true and y_predict
        assert len(y_true) == len(y_predict), \
            "the size of y_true must be equal to the size of y_predict"
        return np.sum((y_true - y_predict)**2) / len(y_true)
    
    
    def root_mean_squared_error(y_true, y_predict):
        # compute the RMSE between y_true and y_predict
        return sqrt(mean_squared_error(y_true, y_predict))
    
    
    def mean_absolute_error(y_true, y_predict):
        # compute the MAE between y_true and y_predict
        assert len(y_true) == len(y_predict), \
            "the size of y_true must be equal to the size of y_predict"
        
        return np.sum(np.absolute(y_true - y_predict)) / len(y_true)
    
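    Since sklearn only ships mean_squared_error and mean_absolute_error, RMSE can be obtained by taking the square root of its MSE. A minimal sketch (the arrays here are made up for illustration):

    import numpy as np
    from math import sqrt
    from sklearn.metrics import mean_squared_error, mean_absolute_error

    # made-up test labels and predictions, for illustration only
    y_test = np.array([3.0, 5.0, 7.0])
    y_predict = np.array([2.5, 5.5, 6.0])

    mse = mean_squared_error(y_test, y_predict)    # MSE from sklearn
    mae = mean_absolute_error(y_test, y_predict)   # MAE from sklearn
    rmse = sqrt(mse)                               # no direct RMSE; take the square root of MSE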

    The R^2 Metric

    The R^2 metric is defined as R^2 = 1 - \frac{SS_{residual}}{SS_{total}}, that is:
    R^2 = 1 - \frac{\sum_i (\hat{y}^{(i)} - y^{(i)})^2}{\sum_i (\bar{y} - y^{(i)})^2}


    The numerator is the error produced by the model's predictions; the denominator is the error produced by always predicting the mean \bar{y} (the error of the baseline model).

    In other words, R^2 measures the fraction of the baseline error that the prediction model does not make.

    • R^2 \leq 1
    • The larger R^2, the better. Its maximum value is 1, attained when the prediction model makes no errors at all; when the model does no better than the baseline model, R^2 is 0
    • If R^2 < 0, the learned model is worse than the baseline model, which suggests the data may not have a linear relationship at all
    • Another way to write it is R^2 = 1 - \frac{MSE(\hat{y}, y)}{Var(y)}, where Var denotes the variance; a minimal implementation based on this form is sketched below
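    A minimal, self-contained sketch of R^2 based on the last formula (this is my own helper, not sklearn's r2_score):

    import numpy as np

    def r2_score(y_true, y_predict):
        '''R^2 = 1 - MSE(y_hat, y) / Var(y), which equals 1 - SS_residual / SS_total'''
        return 1 - np.sum((y_true - y_predict) ** 2) / np.sum((y_true - np.mean(y_true)) ** 2)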

    Multiple Linear Regression

    Extends the number of features from 1 to n; the solution approach is analogous to simple linear regression.

    The prediction becomes \hat{y}^{(i)} = \theta_0 + \theta_1 x^{(i)}_1 + \theta_2 x^{(i)}_2 + \dots + \theta_n x^{(i)}_n

    Objective Function

    As before, find \theta = (\theta_0, \theta_1, \dots, \theta_n)^T that minimizes \sum_{i=1}^{m} (y^{(i)} - \hat{y}^{(i)})^2. Writing X_b for the training matrix X with an extra first column of ones, the predictions are \hat{y} = X_b\theta, the objective is (y - X_b\theta)^T (y - X_b\theta), and the closed-form solution is the normal equation:

    \theta = (X_b^T X_b)^{-1} X_b^T y
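
    Following the style of the simple regression classes above, here is a minimal sketch of multiple linear regression fitted via the normal equation (the class and method names are my own, for illustration only):

    import numpy as np

    class LinearRegressionNormal(object):
        def __init__(self):
            self.coef_ = None        # theta_1 ... theta_n
            self.intercept_ = None   # theta_0
            self._theta = None

        def fit_normal(self, X_train, y_train):
            assert X_train.shape[0] == len(y_train), \
                "the size of X_train must be equal to the size of y_train"
            # prepend a column of ones so that theta_0 acts as the intercept
            X_b = np.hstack([np.ones((len(X_train), 1)), X_train])
            # normal equation: theta = (X_b^T X_b)^{-1} X_b^T y
            self._theta = np.linalg.inv(X_b.T.dot(X_b)).dot(X_b.T).dot(y_train)
            self.intercept_ = self._theta[0]
            self.coef_ = self._theta[1:]
            return self

        def predict(self, X_predict):
            assert self._theta is not None, "must fit before predict!"
            X_b = np.hstack([np.ones((len(X_predict), 1)), X_predict])
            return X_b.dot(self._theta)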
