SMO Algorithm

Author: bdd1b3ad7323 | Published 2019-01-01 19:33

    [Incomplete]
    We want to solve the following problem:
    \mathop{\min}_{\alpha} \frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N} \alpha_i\alpha_jy_iy_jK(x_i,x_j)-\sum_{i=1}^{N}\alpha_i \\ \text{s.t.} \quad \sum_{i=1}^{N}\alpha_iy_i=0 \\ 0\leq\alpha_i\leq C,\quad i=1,2,\dots,N
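    To make the objective concrete, here is a minimal NumPy sketch (the names dual_objective, alpha, y, K are my own, not from the original) that evaluates this dual objective for given multipliers and a precomputed kernel matrix:

```python
import numpy as np

def dual_objective(alpha, y, K):
    """Evaluate the SVM dual objective
    (1/2) * sum_ij alpha_i alpha_j y_i y_j K_ij - sum_i alpha_i.
    alpha: (N,) multipliers; y: (N,) labels in {-1, +1};
    K: (N, N) precomputed kernel matrix."""
    Q = (y[:, None] * y[None, :]) * K      # Q_ij = y_i y_j K(x_i, x_j)
    return 0.5 * alpha @ Q @ alpha - alpha.sum()
```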
    We can fix all the other variables and optimize only \alpha_1 and \alpha_2, which turns the problem into
    \mathop{\min}_{\alpha_1,\alpha_2} W(\alpha_1, \alpha_2)=\frac{1}{2} K_{11}\alpha_1^2+\frac{1}{2} K_{22}\alpha_2^2+y_1y_2K_{12}\alpha_1\alpha_2\\ -(\alpha_1+\alpha_2)+y_1\alpha_1\sum_{i=3}^{N}y_i\alpha_iK_{i1}+y_2\alpha_2\sum_{i=3}^{N}y_i\alpha_iK_{i2}\\ \text{s.t.} \quad \alpha_1y_1+\alpha_2y_2=-\sum_{i=3}^{N}\alpha_iy_i=\zeta\\ 0\leq\alpha_i\leq C,\quad i=1,2
    In the expression above, the terms not involving \alpha_1 and \alpha_2 have been dropped. For convenience of notation, let g(x)=\sum_{j=1}^N\alpha_jy_jK(x_j,x)+b and v_i=\sum_{j=3}^{N}\alpha_jy_jK_{ij}=g(x_i)-\sum_{j=1}^2\alpha_jy_jK_{ij}-b,\quad i=1,2
    Using the equality constraint, \alpha_1 can be expressed in terms of \alpha_2 as \alpha_1=y_1(\zeta-y_2\alpha_2) (since y_1^2=1). Substituting this into the objective yields a convex quadratic function of \alpha_2 alone, which we then minimize.
    The method is to take the derivative and find the unique stationary point. If the stationary point lies inside the feasible interval, \alpha_2 is updated to it; otherwise \alpha_2 is updated to the endpoint of the interval on the side of the stationary point.
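    In standard SMO, the feasible interval [L, H] for \alpha_2 is obtained by intersecting the box 0\leq\alpha_i\leq C with the line \alpha_1y_1+\alpha_2y_2=\zeta. Below is a minimal sketch of the clipping step (the function and variable names are my own):

```python
def clip_alpha2(alpha2_unclipped, alpha1_old, alpha2_old, y1, y2, C):
    """Clip the unconstrained minimizer of the 1-D subproblem to the
    feasible interval [L, H] induced by the box and equality constraints."""
    if y1 != y2:
        # constraint line has slope +1: alpha_1 - alpha_2 is constant
        L = max(0.0, alpha2_old - alpha1_old)
        H = min(C, C + alpha2_old - alpha1_old)
    else:
        # constraint line has slope -1: alpha_1 + alpha_2 is constant
        L = max(0.0, alpha1_old + alpha2_old - C)
        H = min(C, alpha1_old + alpha2_old)
    return min(max(alpha2_unclipped, L), H)
```

    The two cases arise because the constraint line \alpha_1y_1+\alpha_2y_2=\zeta has slope +1 in the (\alpha_1,\alpha_2) plane when y_1\neq y_2 and slope -1 when y_1=y_2.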


    After obtaining the derivative, we set \frac{\partial W}{\partial \alpha_2}=0 and get
    (K_{11}+K_{22}-2K_{12})\alpha_2^{new,unc} = \\ y_2\left[y_2-y_1+(\alpha_{1}^{old}y_1+\alpha_{2}^{old}y_2)(K_{11}-K_{12})\\+\left(g(x_1)-\sum_{j=1}^{2}y_j\alpha_{j}^{old}K_{1j}-b\right)\\-\left(g(x_2)-\sum_{j=1}^{2}y_j\alpha_{j}^{old}K_{2j}-b\right)\right] \\ =y_2(\alpha_{1}^{old}y_1K_{11}+\alpha_{2}^{old}y_2K_{11}-\alpha_{1}^{old}y_1K_{12}-\alpha_{2}^{old}y_2K_{12}\\-y_1\alpha_{1}^{old}K_{11}-y_2\alpha_{2}^{old}K_{12}+y_1\alpha_{1}^{old}K_{21}+y_2\alpha_{2}^{old}K_{22}+g(x_1)-g(x_2)+b-b+y_2-y_1)\\ =y_2\left((K_{11}+K_{22}-2K_{12})\alpha_{2}^{old}y_2+(g(x_1)-y_1)-(g(x_2)-y_2)\right) \\
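    As a sanity check on this algebra, here is a sketch (random data, a linear kernel; all names are my own, and indices 0 and 1 play the roles of 1 and 2) that compares the closed-form stationary point with a direct grid search along the constraint line:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6
X = rng.normal(size=(N, 2))
y = rng.choice([-1.0, 1.0], size=N)
K = X @ X.T                                   # linear kernel K_ij = <x_i, x_j>
alpha = rng.uniform(0.0, 1.0, size=N)
b = 0.0

g = lambda k: np.sum(alpha * y * K[:, k]) + b # g(x_k)
eta = K[0, 0] + K[1, 1] - 2.0 * K[0, 1]
# closed-form stationary point from the equation above
a2_star = alpha[1] + y[1] * ((g(0) - y[0]) - (g(1) - y[1])) / eta

zeta = alpha[0] * y[0] + alpha[1] * y[1]
Q = (y[:, None] * y[None, :]) * K

def W_line(a2):
    """Full dual objective along the constraint line (constants included)."""
    a = alpha.copy()
    a[1] = a2
    a[0] = y[0] * (zeta - y[1] * a2)          # keep alpha_0*y_0 + alpha_1*y_1 = zeta
    return 0.5 * a @ Q @ a - a.sum()

grid = np.linspace(a2_star - 1.0, a2_star + 1.0, 2001)
best = grid[np.argmin([W_line(a) for a in grid])]
print(a2_star, best)                          # the two should agree closely
```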
    Writing E_i=g(x_i)-y_i for the prediction error on x_i and \eta=K_{11}+K_{22}-2K_{12}, the final (unclipped) update formula is
    \alpha_{2}^{new,unc}=\alpha_{2}^{old}+\frac{y_2(E_1-E_2)}{\eta}
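    Putting the pieces together, here is a minimal sketch of one full SMO pair update under the same assumptions as above (all names are my own; a complete implementation would also update the bias b, maintain an error cache, and choose the pair (i, j) by heuristics, none of which are covered here):

```python
import numpy as np

def smo_step(alpha, y, K, b, i, j, C, eps=1e-12):
    """One SMO update on the pair (alpha[i], alpha[j]).
    alpha, y: (N,) arrays; K: (N, N) kernel matrix; b: current bias.
    Returns the updated pair (alpha_i, alpha_j)."""
    g = lambda k: np.sum(alpha * y * K[:, k]) + b   # decision value g(x_k)
    E_i, E_j = g(i) - y[i], g(j) - y[j]             # prediction errors
    eta = K[i, i] + K[j, j] - 2.0 * K[i, j]         # curvature of the 1-D problem
    if eta < eps:                                   # skip a degenerate pair
        return alpha[i], alpha[j]
    aj_unc = alpha[j] + y[j] * (E_i - E_j) / eta    # unclipped minimizer
    # feasible interval [L, H] from the box and equality constraints
    if y[i] != y[j]:
        L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
    else:
        L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
    aj_new = min(max(aj_unc, L), H)                 # clip to [L, H]
    ai_new = alpha[i] + y[i] * y[j] * (alpha[j] - aj_new)  # preserve the equality constraint
    return ai_new, aj_new
```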
