Machine Learning Assignment (R/Python)

Author: kpcjbf8 | Published 2019-03-28 09:10

Assignment: Machine Learning
Due: March 26th at 11:59 PM

Problem 1 [30%]

In this problem, we will establish some basic properties of vectors and linear functions.

1. The L2 norm of a vector measures the length (size) of a vector. The norm of a vector x of size n is defined as:

   \|x\|_2 = \sqrt{\sum_{i=1}^{n} x_i^2}

   Show that the norm can be expressed as the following quadratic expression: \|x\|_2^2 = x^\top x.

2. Let a be a vector of size n and consider the linear function f(x) = a^\top x. Show that the gradient of f is \nabla_x f(x) = a.

3. Let A be a matrix of size n × n and consider the quadratic function f(x) = x^\top A x. Show that the gradient of f is \nabla_x f(x) = 2Ax.

Problem 2 [40%]

In this problem, you will implement a gradient descent algorithm for solving a linear regression problem. The RSS objective in linear regression is:

   f(\beta) = \|y - A\beta\|_2^2

1. Consider the problem of predicting revenue as a function of spending on TV and Radio advertising. There are only 4 data points:

   Revenue  TV  Radio
   20       3   7
   15       4   6
   32       6   1
   5        1   1

   Write down the matrix A and the vector y for this regression problem. Do not forget about the intercept, which can be modeled as a feature with a constant value over all data points. The matrix A should be 4 × 3 dimensional.

2. Express the objective f in terms of linear and quadratic terms.

3. Derive the gradient \nabla_\beta f(\beta) using the linear and quadratic terms above.

4. Implement a gradient descent method with a fixed step size in R/Python.

5. Use your implementation of linear regression to solve the simple problem above and on a small dataset of your choice. Compare the solution with linear regression from R or Python (sklearn). Do not forget about the intercept.

Problem 3 [45%]

Hint: You can follow the slides from the March 20th class, or the LAR reference from the class. See the class website for some recommended linear algebra references.

You will derive the formula used to compute the solution to ridge regression. The objective in ridge regression is:

   f(\beta) = \|y - A\beta\|_2^2 + \lambda \|\beta\|_2^2

Here, \beta is the vector of coefficients that we want to optimize, A is the design matrix, y is the target, and \lambda is the regularization coefficient. The notation \|\cdot\|_2 represents the Euclidean (or L2) norm.

Our goal is to find the \beta that solves:

   \min_\beta f(\beta)

Follow the next steps to compute it.

1. Express the ridge regression objective f(\beta) in terms of linear and quadratic terms. Recall that \|x\|_2^2 = x^\top x. The result should be similar to the objective function of linear regression.

2. Derive the gradient \nabla_\beta f(\beta) using the linear and quadratic terms above.

3. Since f is convex, its minimal value is attained when

   \nabla_\beta f(\beta) = 0

   Derive the expression for the \beta that satisfies the equality above.

4. Implement the algorithm for computing \beta and use it on a small dataset of your choice. Do not forget about the intercept.

5. Compare your solution with glmnet (or another implementation) on a small example and see whether you can make the results match.
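A sketch of the derivations for Problem 1, parts 2 and 3 (my own working, not part of the handout; note that the 2Ax form in part 3 implicitly assumes A is symmetric):

```latex
% Part 2: f(x) = a^T x = \sum_i a_i x_i is linear in each coordinate,
% so \partial f / \partial x_j = a_j, and stacking the partials gives:
\nabla_x \left( a^\top x \right) = a
% Part 3: f(x) = x^T A x = \sum_{i,j} A_{ij} x_i x_j; the coordinate x_k
% appears once through row k and once through column k, so
% \partial f / \partial x_k = \sum_j A_{kj} x_j + \sum_i A_{ik} x_i, i.e.
\nabla_x \left( x^\top A x \right) = (A + A^\top)\, x = 2 A x \quad \text{when } A = A^\top
```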
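A minimal Python sketch of the gradient descent asked for in Problem 2, run on the 4-point advertising dataset above. The step size and iteration count are my own choices (small enough to converge on this tiny problem), not values prescribed by the assignment:

```python
import numpy as np

# Problem 2 data: intercept column of ones, then TV and Radio spending.
A = np.array([[1.0, 3.0, 7.0],
              [1.0, 4.0, 6.0],
              [1.0, 6.0, 1.0],
              [1.0, 1.0, 1.0]])
y = np.array([20.0, 15.0, 32.0, 5.0])

def rss_gradient(beta, A, y):
    # f(beta) = ||y - A beta||^2 = beta^T A^T A beta - 2 y^T A beta + y^T y,
    # so grad f(beta) = 2 A^T A beta - 2 A^T y = 2 A^T (A beta - y).
    return 2.0 * A.T @ (A @ beta - y)

def gradient_descent(A, y, step=1e-3, iters=100_000):
    beta = np.zeros(A.shape[1])
    for _ in range(iters):
        beta = beta - step * rss_gradient(beta, A, y)
    return beta

beta_hat = gradient_descent(A, y)
print("gradient descent:", beta_hat)
# Sanity check against the direct least-squares solution.
print("lstsq:           ", np.linalg.lstsq(A, y, rcond=None)[0])
```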
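For Problem 3, a sketch of the derivation that steps 1 through 3 walk through (again my own working; it reuses the identities from Problem 1):

```latex
% Step 1: expand both squared norms using \|v\|_2^2 = v^\top v.
f(\beta) = \beta^\top A^\top A \beta - 2\, y^\top A \beta + y^\top y + \lambda\, \beta^\top \beta
% Step 2: differentiate term by term (A^T A and I are symmetric).
\nabla_\beta f(\beta) = 2 A^\top A \beta - 2 A^\top y + 2 \lambda \beta
% Step 3: set the gradient to zero and solve the normal equations.
(A^\top A + \lambda I)\, \beta = A^\top y
\qquad\Longrightarrow\qquad
\hat\beta = (A^\top A + \lambda I)^{-1} A^\top y
```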
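And a minimal Python sketch of the resulting closed-form ridge solver (step 4); the dataset is just the Problem 2 example reused, and λ = 1.0 is an arbitrary illustration value:

```python
import numpy as np

def ridge_closed_form(A, y, lam):
    """Solve (A^T A + lam * I) beta = A^T y, the ridge normal equations."""
    n_features = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n_features), A.T @ y)

# Reusing the 4-point advertising data from Problem 2.
A = np.array([[1.0, 3.0, 7.0],
              [1.0, 4.0, 6.0],
              [1.0, 6.0, 1.0],
              [1.0, 1.0, 1.0]])
y = np.array([20.0, 15.0, 32.0, 5.0])

print(ridge_closed_form(A, y, lam=1.0))   # arbitrary illustration value
print(ridge_closed_form(A, y, lam=1e-9))  # approaches ordinary least squares
```

One caveat for step 5: this sketch penalizes the intercept coefficient along with the others, whereas glmnet leaves the intercept unpenalized and standardizes features by default, so matching its output exactly requires accounting for both.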
