Regression
Linear models
https://blog.csdn.net/red_stone1/article/details/81122926
Linear models for classification
Linear regression: fits a linear function of the inputs by minimizing squared error; the output is a continuous value.
Logistic regression: a linear model for classification that passes the linear score through a sigmoid to give a class probability.
Support vector machine: finds the separating hyperplane with the maximum margin between the classes (see the sketch after this list).
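A minimal sketch of these three models on toy data, assuming scikit-learn and NumPy are available (the data points and the query value 1.5 are invented for illustration):

```python
# Rough sketch of the three linear models with scikit-learn on toy data.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.svm import SVC

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y_reg = np.array([0.1, 0.9, 2.1, 2.9])   # continuous target for regression
y_cls = np.array([0, 0, 1, 1])           # binary labels for classification

# Linear regression: fits y ~ w*x + b by least squares.
print(LinearRegression().fit(X, y_reg).predict([[1.5]]))

# Logistic regression: models P(y = 1 | x) with a sigmoid over w*x + b.
print(LogisticRegression().fit(X, y_cls).predict_proba([[1.5]]))

# Linear SVM: finds the maximum-margin separating hyperplane.
print(SVC(kernel="linear").fit(X, y_cls).predict([[1.5]]))
```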
K-nearest neighbors: used for either classification or regression
- In k-NN classification, the output is a class membership. An object is classified by a plurality vote of its neighbors, with the object being assigned to the class most common among its k nearest neighbors (k is a positive integer, typically small). If k = 1, then the object is simply assigned to the class of that single nearest neighbor.
- In k-NN regression, the output is the property value for the object. This value is the average of the values of its k nearest neighbors (see the sketch below).
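A minimal k-NN classification sketch in plain NumPy, following the plurality-vote description above (the toy training points, labels, and k = 3 are made up):

```python
# k-NN classification: vote among the k closest training points.
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Distance from the query point to every training point.
    dists = np.linalg.norm(X_train - x, axis=1)
    # Labels of the k nearest neighbors.
    nearest = y_train[np.argsort(dists)[:k]]
    # Plurality vote; with k = 1 this is just the label of the closest point.
    return Counter(nearest).most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.2, 0.1])))  # -> 0
```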
Markov networks: undirected probabilistic graphical models; the joint distribution is normalized so that it sums to 1, and each variable depends only on its neighbors in the graph, which is what lets the model predict the current state of one variable from the others.
Bayes networks: directed acyclic graphical models in which each node's distribution is conditioned on its parents.
Difference:
A Markov network or MRF is similar to a Bayesian network in its representation of dependencies; the differences being that Bayesian networks are directed and acyclic, whereas Markov networks are undirected and may be cyclic. The underlying graph of a Markov random field may be finite or infinite.
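A small sketch of the two factorizations on a two-variable toy example (all the numbers below are made up): a Bayesian network multiplies conditional probabilities along its directed edges and is already normalized, while a Markov network multiplies undirected potentials and divides by the partition function Z so the joint sums to 1.

```python
import itertools

# Bayesian network A -> B: P(A, B) = P(A) * P(B | A); already sums to 1.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}
bn_joint = {(a, b): p_a[a] * p_b_given_a[a][b]
            for a, b in itertools.product([0, 1], repeat=2)}

# Markov network A - B: P(A, B) = phi(A, B) / Z; potentials need not sum to 1.
phi = {(0, 0): 4.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 4.0}
Z = sum(phi.values())
mrf_joint = {ab: v / Z for ab, v in phi.items()}

print(sum(bn_joint.values()), sum(mrf_joint.values()))  # both 1.0
```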
Other terms:
Markov chain/process: a sequence of random states with the Markov property; it may evolve in discrete time steps or in continuous time (see the simulation sketch below).
Stochastic process: a collection of random variables indexed by time, e.g. the number of events that occur before a fixed time, or the waiting time between two events.
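A tiny discrete-time Markov chain simulation in NumPy (the two-state transition matrix is arbitrary); the next state is sampled using only the current state:

```python
import numpy as np

P = np.array([[0.9, 0.1],    # P[i, j] = probability of moving from state i to j
              [0.5, 0.5]])
rng = np.random.default_rng(0)

state = 0
path = [state]
for _ in range(10):
    state = rng.choice(2, p=P[state])   # Markov property: depends only on `state`
    path.append(state)
print(path)
```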
Decision Tree
Random Forest
Neural network
Classification
Logistic regression
Support vector machine
Naive Bayes
Decision Tree
Random Forest
Neural network