Linear Regression and Gradient Descent (Code Implementation)

Author: lilicat | Published 2019-01-22 18:23

    Key points

    1 Feature normalization
    2 Loss function
    3 Gradient descent

    Feature normalization

    import numpy as np

    def norm(feature):
        # Min-max normalization: rescale each column (feature) to the [0, 1] range.
        # Note: this modifies `feature` in place, so it expects a float array.
        matmin = feature.min(axis=0)
        feature -= matmin
        matmax = feature.max(axis=0)      # column-wise range after shifting to zero
        if feature.ndim > 1:
            matmax[matmax == 0] = 1       # avoid division by zero for constant columns
        elif matmax == 0 and feature.ndim == 1:
            matmax = 1
        feature /= matmax
        return feature
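
    As a quick sanity check, a minimal usage sketch (the array below is made up purely for illustration); after normalization each column should span [0, 1]:

    data = np.array([[1.0, 200.0],
                     [2.0, 400.0],
                     [3.0, 800.0]])
    print(norm(data))
    # Expected: first column -> 0, 0.5, 1; second column -> 0, 1/3, 1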
    

    Loss function

    def computeCost(X, y, theta):
        # Mean squared error cost J(theta) = 1/(2m) * sum((X * theta.T - y)^2),
        # where X is the m x n design matrix, y the m x 1 targets, and theta a
        # 1 x n row vector (all np.matrix objects).
        inner = np.power(((X * theta.T) - y), 2)
        return np.sum(inner) / (2 * len(X))
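
    As a rough check, the cost of an all-zero theta on a tiny invented dataset (the numbers below are hypothetical, not from the post):

    X = np.matrix([[1.0, 0.0],
                   [1.0, 0.5],
                   [1.0, 1.0]])            # first column is the bias term
    y = np.matrix([[1.0], [2.0], [3.0]])
    theta = np.matrix([[0.0, 0.0]])
    print(computeCost(X, y, theta))        # 1/(2*3) * (1 + 4 + 9) = 2.3333...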
    

    Gradient descent

    def gradientDescent(X, y, theta, alpha, iters):
        # Batch gradient descent for linear regression.
        # X: m x n design matrix (np.matrix), y: m x 1 targets, theta: 1 x n parameters,
        # alpha: learning rate, iters: maximum number of update rounds.
        temp = np.matrix(np.zeros(theta.shape))
        parameters = int(theta.ravel().shape[1])
        cost = np.zeros(iters + 2)    # cost history; the loop runs up to iters + 1 rounds

        for i in range(iters + 1):
            cost[i] = computeCost(X, y, theta)
            error = (X * theta.T) - y
            # Update every parameter using the gradient of the cost on the full batch.
            for j in range(parameters):
                term = np.multiply(error, X[:, j])
                temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))

            theta = temp
            cost[i + 1] = computeCost(X, y, theta)
            # Early stopping: quit once the cost stops decreasing or is close to zero.
            if cost[i + 1] >= cost[i] or cost[i + 1] < 0.001:
                print("Stopping at round %d, and the cost is %1.4f " % (i, cost[i]))
                print(cost)
                return theta, cost
            # Report progress every 100 rounds.
            if (i + 1) % 100 == 1:
                print("round %d, and the cost is %1.4f  " % (i, cost[i + 1]))
        print("Stopping at round %d, and the cost is %1.4f " % (iters, cost[iters + 1]))
        print(cost)
        return theta, cost
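
    Putting the three functions together, a minimal end-to-end sketch on hypothetical single-feature data (the data, learning rate alpha=0.1, and iters=1000 are assumptions for illustration, not values from the original post):

    import numpy as np

    # Hypothetical data: y is roughly 2*x plus noise.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=(100, 1))
    y = 2 * x + rng.normal(0, 0.5, size=(100, 1))

    x = norm(x)                                        # scale the feature to [0, 1]
    X = np.matrix(np.hstack([np.ones((100, 1)), x]))   # prepend a bias column
    y = np.matrix(y)
    theta = np.matrix(np.zeros((1, 2)))

    theta, cost = gradientDescent(X, y, theta, alpha=0.1, iters=1000)
    print(theta)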
    
