Nonlinear Regression (Logistic Regression)

Author: foochane | Published 2018-01-18 23:26

    1. Basic model

    Test data: X(x0, x1, x2, ..., xn)

    Parameters to learn: Θ(θ0, θ1, θ2, ..., θn)

    The hypothesis applies the logistic (sigmoid) function to a linear combination of the inputs:

    h_θ(X) = 1 / (1 + e^(−Θ^T X))
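
    In NumPy, the hypothesis might look like the sketch below (this is an addition for illustration, not from the original post, which showed the formulas as images; `sigmoid` and `hypothesis` are illustrative names):

    import numpy as np

    def sigmoid(z):
        # logistic (sigmoid) function: maps any real number into (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    def hypothesis(theta, x):
        # h_θ(x) = g(Θᵀx): the predicted probability that y = 1 given x
        return sigmoid(np.dot(x, theta))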

    2. Cost function

    Linear regression uses the squared-error cost:

    J(θ) = (1/(2m)) * Σ_{i=1..m} (h_θ(x^(i)) − y^(i))^2

    Nonlinear regression (logistic regression) uses the cross-entropy cost:

    J(θ) = −(1/m) * Σ_{i=1..m} [ y^(i) * log(h_θ(x^(i))) + (1 − y^(i)) * log(1 − h_θ(x^(i))) ]

    Goal: find the values θ0, θ1, ... that minimize the expression above.
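
    As a minimal sketch of the cross-entropy cost above (an addition for illustration; `logistic_cost` is not from the original):

    import numpy as np

    def logistic_cost(theta, x, y):
        # J(θ) = −(1/m) Σ [ y·log(h) + (1−y)·log(1−h) ]
        m = len(y)
        h = 1.0 / (1.0 + np.exp(-np.dot(x, theta)))   # sigmoid hypothesis
        # in practice h is often clipped away from 0 and 1 to avoid log(0)
        return -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) / m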

    3. Solution: gradient descent

    Update rule:

    θ_j := θ_j − α * ∂J(θ)/∂θ_j = θ_j − α * (1/m) * Σ_{i=1..m} (h_θ(x^(i)) − y^(i)) * x_j^(i)

    Learning rate: α, which controls the size of each step.

    Update all of the θ_j simultaneously, and repeat until convergence; a vectorized sketch of one step follows.
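
    One vectorized update step for the logistic hypothesis might look like this (an illustrative sketch; `gradient_step` is not from the original, and the demo code in the next section applies the same update with a linear hypothesis instead):

    import numpy as np

    def gradient_step(theta, x, y, alpha):
        h = 1.0 / (1.0 + np.exp(-np.dot(x, theta)))   # h_θ(x), the sigmoid hypothesis
        gradient = np.dot(x.T, h - y) / len(y)        # ∂J/∂θ = xᵀ(h − y) / m
        return theta - alpha * gradient               # simultaneous update of all θ_j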

    4. Code

    import numpy as np
    import random

    def genData(numPoints, bias, variance):
        # Build a toy dataset: column 0 of x is a constant bias feature,
        # column 1 is the sample index; y is a noisy linear function of i.
        x = np.zeros(shape=(numPoints, 2))
        y = np.zeros(shape=numPoints)
        for i in range(numPoints):
            x[i][0] = 1                                  # bias (intercept) feature
            x[i][1] = i                                  # input feature
            # Scale the noise by `variance` so the parameter has an effect
            # (the original added it as a constant offset instead).
            y[i] = (i + bias) + random.uniform(0, 1) * variance
        return x, y

    def gradientDescent(x, y, theta, alpha, m, numIterations):
        # Batch gradient descent on the squared-error cost. Note that the
        # hypothesis here is linear (no sigmoid); the demo illustrates the
        # update rule itself rather than the logistic model.
        xTran = np.transpose(x)
        for i in range(numIterations):
            hypothesis = np.dot(x, theta)                # h_θ(x) = x · θ
            loss = hypothesis - y                        # h_θ(x) − y
            cost = np.sum(loss ** 2) / (2 * m)           # J(θ)
            gradient = np.dot(xTran, loss) / m           # ∂J/∂θ = xᵀ(h − y) / m
            theta = theta - alpha * gradient             # simultaneous update
            if i % 10000 == 0:                           # report occasionally
                print("Iteration %d | cost: %f" % (i, cost))
        return theta

    x, y = genData(100, 25, 10)
    print("x:")
    print(x)
    print("y:")
    print(y)

    m, n = np.shape(x)       # m samples, n features (including the bias column)
    n_y = np.shape(y)

    print("m:" + str(m) + " n:" + str(n) + " n_y:" + str(n_y))

    numIterations = 100000
    alpha = 0.0005
    theta = np.ones(n)       # initialize all parameters to 1
    theta = gradientDescent(x, y, theta, alpha, m, numIterations)
    print(theta)
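
    As a sanity check (not part of the original post): because the demo above fits a linear hypothesis, the same problem also has a closed-form least-squares solution, so the gradient-descent result can be compared against NumPy's solver:

    # closed-form least-squares fit for comparison (illustrative)
    theta_exact, *_ = np.linalg.lstsq(x, y, rcond=None)
    print(theta_exact)   # should be close to the theta found above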
    





    [Note]: This article is a study note from the 麦子学院 machine learning course.
