ViterbiLearning Class 
Namespace: Accord.Statistics.Models.Markov.Learning
public class ViterbiLearning : BaseViterbiLearning<int[]>, IUnsupervisedLearning, IConvergenceLearning, IUnsupervisedLearning<HiddenMarkovModel, int[], int[]>
The ViterbiLearning type exposes the following members.
Constructors

Name  Description

ViterbiLearning 
Creates a new instance of the Viterbi learning algorithm.

Properties

Name  Description

Batches 
Gets or sets how many batches the learning data should be divided into
during learning. Batches are used to adequately estimate the initial models
so they can better compute the Viterbi paths for subsequent passes of the
algorithm. Default is 1.
(Inherited from BaseViterbiLearning<T>.)  
CurrentIteration 
Gets the current iteration.
(Inherited from BaseViterbiLearning<T>.)  
HasConverged 
Gets a value indicating whether this instance has converged.
(Inherited from BaseViterbiLearning<T>.)  
Iterations  Obsolete.
Please use MaxIterations instead.
(Inherited from BaseViterbiLearning<T>.)  
MaxIterations 
Gets or sets the maximum number of iterations
performed by the learning algorithm.
(Inherited from BaseViterbiLearning<T>.)  
Model 
Gets the model being trained.
 
Token 
Gets or sets a cancellation token that can be used to
stop the learning algorithm while it is running.
(Inherited from BaseViterbiLearning<T>.)  
Tolerance 
Gets or sets the maximum change in the average log-likelihood
after an iteration of the algorithm used to detect convergence.
(Inherited from BaseViterbiLearning<T>.)  
UseLaplaceRule 
Gets or sets whether to use Laplace's rule
of succession to avoid zero probabilities.

Methods

Name  Description

ComputeLogLikelihood 
Computes the log-likelihood of the current model for the given observations.
(Overrides BaseViterbiLearning<T>.ComputeLogLikelihood(T).)  
Equals  Determines whether the specified object is equal to the current object. (Inherited from Object.)  
Finalize  Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)  
GetHashCode  Serves as the default hash function. (Inherited from Object.)  
GetType  Gets the Type of the current instance. (Inherited from Object.)  
Learn 
Learns a model that can map the given inputs to the desired outputs.
 
MemberwiseClone  Creates a shallow copy of the current Object. (Inherited from Object.)  
Run 
Runs the learning algorithm.
(Inherited from BaseViterbiLearning<T>.)  
RunEpoch 
Runs a single epoch (iteration) of the learning algorithm.
(Overrides BaseViterbiLearning<T>.RunEpoch(T, Int32).)  
ToString  Returns a string that represents the current object. (Inherited from Object.) 
Extension Methods

Name  Description

HasMethod 
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)  
IsEqual 
Compares two objects for equality, performing an elementwise
comparison if the elements are vectors or matrices.
(Defined by Matrix.)  
To(Type)  Overloaded.
Converts an object into another type, irrespective of whether
the conversion can be done at compile time or not. This can be
used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)  
To<T>  Overloaded.
Converts an object into another type, irrespective of whether
the conversion can be done at compile time or not. This can be
used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.) 
Remarks

The Viterbi learning algorithm is an alternative learning algorithm for hidden Markov models. It works by obtaining the Viterbi path for the set of training observation sequences and then computing the maximum likelihood estimates for the model parameters. These two steps are repeated iteratively until the model converges.
The Viterbi learning algorithm is also known as the segmental k-means algorithm.
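To make the two-step procedure above concrete, the following Python sketch (illustrative only; the helper names are not part of Accord.NET) performs one Viterbi-learning pass on a discrete HMM: decode each training sequence with the Viterbi algorithm, then re-estimate the parameters by counting transitions and emissions along the decoded paths, with add-one (Laplace) smoothing analogous to the UseLaplaceRule option.

```python
import math

def viterbi_path(pi, A, B, obs):
    """Most likely state sequence, computed in the log domain.
    Assumes strictly positive probabilities so math.log is defined."""
    n, T = len(pi), len(obs)
    logd = [[0.0] * n for _ in range(T)]  # best log-probability ending in each state
    back = [[0] * n for _ in range(T)]    # backpointers for path recovery
    for s in range(n):
        logd[0][s] = math.log(pi[s]) + math.log(B[s][obs[0]])
    for t in range(1, T):
        for s in range(n):
            best, arg = max((logd[t-1][p] + math.log(A[p][s]), p) for p in range(n))
            logd[t][s] = best + math.log(B[s][obs[t]])
            back[t][s] = arg
    # Backtrack from the best final state.
    path = [max(range(n), key=lambda s: logd[T-1][s])]
    for t in range(T - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

def reestimate(paths, sequences, n_states, n_symbols):
    """Maximum-likelihood counts from the decoded paths, with add-one
    (Laplace) smoothing so no probability collapses to zero."""
    pi = [1.0] * n_states
    A = [[1.0] * n_states for _ in range(n_states)]
    B = [[1.0] * n_symbols for _ in range(n_states)]
    for path, obs in zip(paths, sequences):
        pi[path[0]] += 1
        for t in range(len(path) - 1):
            A[path[t]][path[t+1]] += 1
        for s, o in zip(path, obs):
            B[s][o] += 1
    norm = lambda row: [x / sum(row) for x in row]
    return norm(pi), [norm(r) for r in A], [norm(r) for r in B]

# One Viterbi-learning iteration on toy data (two states, two symbols):
sequences = [[0, 1, 1, 1], [0, 1, 1], [0, 1, 1, 1, 1]]
pi = [0.5, 0.5]
A = [[0.5, 0.5], [0.5, 0.5]]
B = [[0.9, 0.1], [0.1, 0.9]]
paths = [viterbi_path(pi, A, B, obs) for obs in sequences]
pi, A, B = reestimate(paths, sequences, 2, 2)
```

A full implementation would repeat the decode/re-estimate cycle until the change in average log-likelihood falls below a tolerance, which is exactly what the Tolerance and MaxIterations properties control in ViterbiLearning.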
Examples

Accord.Math.Random.Generator.Seed = 0;

// We will try to create a Hidden Markov Model which
// can detect if a given sequence starts with a zero
// and has any number of ones after that.
int[][] sequences = new int[][]
{
    new int[] { 0,1,1,1,1,0,1,1,1,1 },
    new int[] { 0,1,1,1,0,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1,1,1,1,1 },
    new int[] { 0,1,1,1,1,1,1,1,1,1 },
};

// Creates a new Hidden Markov Model with 3 states for
// an output alphabet of two characters (zero and one)
HiddenMarkovModel hmm = new HiddenMarkovModel(new Forward(3), 2);

// Try to fit the model to the data until the difference in
// the average log-likelihood changes only by as little as 0.0001
var teacher = new ViterbiLearning(hmm)
{
    Tolerance = 0.0001,
    MaxIterations = 0 // no iteration cap (Iterations is obsolete)
};

// Learn the model
teacher.Learn(sequences);

// Calculate the probability that the given
// sequences originated from the model
double l1; hmm.Decode(new int[] { 0, 1 }, out l1);       // 0.5394
double l2; hmm.Decode(new int[] { 0, 1, 1, 1 }, out l2); // 0.4485

// Sequences which do not start with zero have much lower probability.
double l3; hmm.Decode(new int[] { 1, 1 }, out l3);       // 0.0864
double l4; hmm.Decode(new int[] { 1, 0, 0, 0 }, out l4); // 0.0004

// Sequences which contain a few errors have higher probability
// than the ones which do not start with zero. This shows some
// of the temporal elasticity and error tolerance of the HMMs.
double l5; hmm.Decode(new int[] { 0, 1, 0, 1, 1, 1, 1, 1, 1 }, out l5); // 0.0154
double l6; hmm.Decode(new int[] { 0, 1, 1, 1, 1, 1, 1, 0, 1 }, out l6); // 0.0154