HMM Learning

Author: 华山论剑 | Published 2017-02-03 20:34

Original article: http://www.shokhirev.com/nikolai/abc/alg/hmm/hmm.html
Study notes in progress, updated gradually...

Definition: The Hidden Markov Model (HMM) is a variant of a finite state machine having a set of hidden states, Q, an output alphabet (observations), O, transition probabilities, A, output (emission) probabilities, B, and initial state probabilities, Π. The current state is not observable. Instead, each state produces an output with a certain probability (B). Usually the states, Q, and outputs, O, are understood, so an HMM is said to be a triple, ( A, B, Π ).

Formal Definition:

Hidden states Q = { qi }, i = 1, . . . , N .

Transition probabilities A = {aij = P(qj at t + 1 | qi at t)}, where P(a | b) is the conditional probability of a given b, t = 1, . . . , T is time, and qi in Q. Informally, aij is the probability that the next state is qj given that the current state is qi.

Observations (symbols) O = { ok }, k = 1, . . . , M .

Emission probabilities B = { bik = bi(ok) = P(ok | qi) }, where ok in O. Informally, bik is the probability that the output is ok given that the current state is qi.

Initial state probabilities Π = {pi = P(qi at t = 1)}.
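
As a concrete illustration of the triple ( A, B, Π ), the parameters can be written down directly as arrays. Below is a minimal Python sketch with two hidden states and three output symbols; all numbers are invented for illustration and are not part of the original article.

```python
import numpy as np

# Hypothetical HMM: N = 2 hidden states, M = 3 output symbols.
Q = ["q1", "q2"]        # hidden states
O = ["o1", "o2", "o3"]  # output alphabet

# A[i, j] = P(q_j at t+1 | q_i at t); each row sums to 1.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# B[i, k] = b_i(o_k) = P(o_k | q_i); each row sums to 1.
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])

# pi[i] = P(q_i at t = 1).
pi = np.array([0.6, 0.4])
```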

Canonical problems

There are 3 canonical problems to solve with HMMs:

1. Given the model parameters, compute the probability of a particular output sequence. This problem is solved by the Forward and Backward algorithms (described below).
2. Given the model parameters, find the most likely sequence of (hidden) states that could have generated a given output sequence. Solved by the Viterbi algorithm and Posterior decoding.
3. Given an output sequence, find the most likely set of state transition and output probabilities. Solved by the Baum-Welch algorithm.

Forward Algorithm

Let αt(i) be the probability of the partial observation sequence Ot = {o(1), o(2), ... , o(t)} being produced by all possible state sequences that end in the i-th state:

αt(i) = P(o(1), o(2), ... , o(t) , q(t) = qi ),

i.e., the joint probability of observing o(1), ... , o(t) and being in state qi at time t.

Then the unconditional probability of the partial observation sequence is the sum of αt(i) over all N states.

The Forward Algorithm is a recursive procedure for calculating αt(i) for observation sequences of increasing length t. First, the probabilities for the single-symbol sequence are calculated as the product of the initial i-th state probability and the emission probability of the given symbol o(1) in the i-th state. Then the recursive formula is applied: assume we have calculated αt(i) for some t; to calculate αt+1(j), we multiply every αt(i) by the corresponding transition probability from the i-th state to the j-th state, sum the products over all states, and then multiply the result by the emission probability of the symbol o(t+1) in the j-th state. Iterating the process, we eventually calculate αT(i) and, summing over all states, obtain the required probability P(O).

Formal Definition

Initialization:

α1(i) = pi bi(o(1)) , i =1, ... , N

Recursion:

αt+1(j) = [ Σi=1..N αt(i) aij ] bj(o(t+1)) , j = 1, ... , N , t = 1, ... , T - 1

Termination:

P(O) = Σi=1..N αT(i)
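
Putting the three steps together, the Forward Algorithm is only a few lines of code. The sketch below reuses the hypothetical A, B, pi arrays from the earlier snippet; `obs` is the observation sequence given as 0-based indices into O.

```python
def forward(obs, A, B, pi):
    """Return (alpha, P(O)), where alpha[t - 1, i] holds alpha_t(i)."""
    N = A.shape[0]
    T = len(obs)
    alpha = np.zeros((T, N))

    # Initialization: alpha_1(i) = pi_i * b_i(o(1))
    alpha[0] = pi * B[:, obs[0]]

    # Recursion: alpha_{t+1}(j) = (sum_i alpha_t(i) a_ij) * b_j(o(t+1))
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Termination: P(O) = sum_i alpha_T(i)
    return alpha, alpha[-1].sum()

alpha, p_obs = forward([0, 2, 1], A, B, pi)  # observation sequence o1, o3, o2
print(p_obs)                                 # probability of that sequence
```

Note that for long sequences the αt(i) values underflow floating-point precision, so practical implementations rescale each step or work in log space.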
