LSA
LSA dimensionality reduction is easy to do with the sklearn package and returns the reduced matrix: first build the text matrix with TfidfVectorizer, then apply the SVD transform with decomposition.TruncatedSVD. Remember to pass the number of dimensions to keep, otherwise it defaults to 2. See:
http://scikit-learn.org/stable/modules/generated/sklearn.decomposition.TruncatedSVD.html
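A minimal sketch of the pipeline described above (the example documents and the choice of n_components here are placeholders, not from the original post):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Placeholder corpus; replace with your own documents.
docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "dogs and cats are pets",
]

# Build the TF-IDF document-term matrix.
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)

# Reduce to k latent dimensions; n_components defaults to 2 if omitted.
svd = TruncatedSVD(n_components=2)
X_lsa = svd.fit_transform(X)

print(X_lsa.shape)  # (n_documents, n_components)
```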
"Forgetting algorithm" (遗忘算法) blog
http://blog.csdn.net/gzdmcaoyc?viewmode=contents
Natural language processing:
http://www.cs.columbia.edu/~mcollins/
ACL: http://www.aclweb.org/anthology/
Ansj user guide
http://nlpchina.github.io/ansj_seg/
DP (Dirichlet process / Bayesian nonparametrics)
https://bitbucket.org/michaelchughes/bnpy-dev/overview
Probability density function / probability mass function
https://zh.wikipedia.org/wiki/%E6%A6%82%E7%8E%87%E8%B4%A8%E9%87%8F%E5%87%BD%E6%95%B0
Bernoulli distribution
https://zh.wikipedia.org/wiki/%E4%BC%AF%E5%8A%AA%E5%88%A9%E5%88%86%E5%B8%83
EM algorithm
http://www.csuldw.com/2015/12/02/2015-12-02-EM-algorithms/
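As a companion to the EM post linked above, here is a minimal, hedged sketch of EM for a two-component 1-D Gaussian mixture; the synthetic data, initialization, and iteration count are made up for illustration and are not from the linked article:

```python
import numpy as np

# Synthetic 1-D data drawn from two Gaussians.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

# Initial guesses for mixture weights, means, and variances.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def gaussian(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibility of each component for each point.
    dens = np.stack([pi[k] * gaussian(x, mu[k], var[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)

    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=1)
    pi = nk / len(x)
    mu = (resp * x).sum(axis=1) / nk
    var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk

print("weights:", pi, "means:", mu, "variances:", var)
```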
Mathematical notation
http://baike.baidu.com/view/37054.htm
Marginal distribution
https://en.wikipedia.org/wiki/Marginal_distribution
Using GBDT to select features
http://www.letiantian.me/2015-03-31-use-gbdt-to-select-features/
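A hedged sketch of the GBDT feature-selection idea behind the link above, using scikit-learn's GradientBoostingClassifier and its feature_importances_ attribute; the synthetic data and the importance threshold are assumptions, not taken from the linked post:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic data stands in for a real feature matrix.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

# Fit a GBDT model and read off per-feature importances.
gbdt = GradientBoostingClassifier(n_estimators=100, random_state=0)
gbdt.fit(X, y)

# Keep features whose importance exceeds an (arbitrary) threshold.
importances = gbdt.feature_importances_
selected = np.where(importances > 0.02)[0]
print("selected feature indices:", selected)
X_selected = X[:, selected]
```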
TensorFlow:
http://www.leiphone.com/news/201511/UDLyNds2oSTwM2yZ.html
Python for Data Analysis (Chinese edition):
http://pda.readthedocs.io/en/latest/chp4.html
CNN convolution demo (animated)
http://cs231n.github.io/assets/conv-demo/index.html
Neural networks
http://yosinski.com/deepvis
http://vision03.csail.mit.edu/cnn_art/index.html
https://github.com/yosinski/deep-visualization-toolbox
AlexNet definitions in Caffe
https://github.com/BVLC/caffe/blob/master/models/bvlc_alexnet/train_val.prototxt
Boltzmann machine
http://202.197.191.206:8080/30/text/chapter06/6_3.htm
http://valser.org/thread-833-1-1.html
https://www.google.com/patents/CN103530689A?cl=zh
http://www.nvidia.cn/object/machine-learning-cn.html
http://shujuren.org/article/258.html
K-means clustering in Python, hands-on:
http://sobuhu.com/ml/2012/11/25/kmeans-python.html
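A minimal K-means example as a quick companion to the hands-on post above; the linked article may implement the algorithm from scratch, while this sketch simply uses scikit-learn's KMeans on synthetic placeholder data:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic 2-D points grouped around 3 centers.
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# Fit K-means with k=3 and inspect assignments and centroids.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)
print(kmeans.cluster_centers_)
print(labels[:10])
```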
text summarization
https://www.quora.com/Has-Deep-Learning-been-applied-to-automatic-text-summarization-successfully