HiddenMarkovModel<TDistribution, TObservation> Class

Namespace: Accord.Statistics.Models.Markov

```csharp
[SerializableAttribute]
public class HiddenMarkovModel<TDistribution, TObservation> : LikelihoodTaggerBase<TObservation>, ICloneable
    where TDistribution : IDistribution<TObservation>
```

The HiddenMarkovModel<TDistribution, TObservation> type exposes the following members.
**Constructors**

Name | Description
---|---
HiddenMarkovModel<TDistribution, TObservation>() | Constructs a new Hidden Markov Model.
HiddenMarkovModel<TDistribution, TObservation>(ITopology) | Constructs a new Hidden Markov Model.
HiddenMarkovModel<TDistribution, TObservation>(Int32, Func<Int32, TDistribution>) | Constructs a new Hidden Markov Model with arbitrary-density state probabilities.
HiddenMarkovModel<TDistribution, TObservation>(Int32, TDistribution) | Constructs a new Hidden Markov Model with arbitrary-density state probabilities.
HiddenMarkovModel<TDistribution, TObservation>(ITopology, Func<Int32, TDistribution>) | Constructs a new Hidden Markov Model with arbitrary-density state probabilities.
HiddenMarkovModel<TDistribution, TObservation>(ITopology, TDistribution) | Constructs a new Hidden Markov Model with arbitrary-density state probabilities.
HiddenMarkovModel<TDistribution, TObservation>(ITopology, TDistribution[]) | Constructs a new Hidden Markov Model with arbitrary-density state probabilities.
HiddenMarkovModel<TDistribution, TObservation>(Double[,], TDistribution[], Double[], Boolean) | Constructs a new Hidden Markov Model with arbitrary-density state probabilities.
HiddenMarkovModel<TDistribution, TObservation>(Double[][], TDistribution[], Double[], Boolean) | Constructs a new Hidden Markov Model with arbitrary-density state probabilities.
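For example, the `Func<Int32, TDistribution>` overloads above let each state's emission density be created through a factory function. The following is a minimal sketch, not from this page's examples: the Gaussian parameters are arbitrary, and only the constructor shapes listed in the table are assumed.

```csharp
// Sketch: a 3-state model whose emission densities are produced by a
// factory, one Gaussian per state index (parameters chosen arbitrarily).
var hmm = new HiddenMarkovModel<NormalDistribution, double>(3,
    stateIndex => new NormalDistribution(mean: stateIndex, stdDev: 1));
```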
**Properties**

Name | Description
---|---
Algorithm | Gets or sets the algorithm that should be used to compute solutions to this model's LogLikelihood(T[] input) evaluation, Decide(T[] input) decoding, and LogLikelihoods(T[] input) posterior problems. (See the usage sketch following this table.)
Emissions | Gets the Emission matrix (B) for this model.
LogInitial | Gets the log-initial probabilities log(pi) for this model.
LogTransitions | Gets the log-transition matrix log(A) for this model.
NumberOfClasses | Gets the number of classes expected and recognized by the classifier. (Inherited from TaggerBase<TInput>.)
NumberOfInputs | Gets the number of inputs accepted by the model. (Inherited from TransformBase<TInput, TOutput>.)
NumberOfOutputs | Gets the number of outputs generated by the model. (Inherited from TransformBase<TInput, TOutput>.)
NumberOfStates | Gets the number of states of this model.
States | Obsolete. Gets the number of states of this model.
Tag | Gets or sets a user-defined tag associated with this model.
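As a usage sketch for the Algorithm property: assuming the `HiddenMarkovModelAlgorithm` enumeration of recent Accord.NET versions (with `Forward` and `Viterbi` members), the property switches whether likelihood queries follow the full forward algorithm or the single best (Viterbi) state path. The model construction below is illustrative only, with arbitrary parameters.

```csharp
// Sketch: assumes the HiddenMarkovModelAlgorithm enum (Forward / Viterbi)
// is available; a 2-state model with an arbitrary Gaussian emission.
var hmm = new HiddenMarkovModel<NormalDistribution, double>(2, new NormalDistribution(0, 1));

hmm.Algorithm = HiddenMarkovModelAlgorithm.Forward; // evaluate over all state paths
double forward = hmm.LogLikelihood(new double[] { 0.1, 0.2 });

hmm.Algorithm = HiddenMarkovModelAlgorithm.Viterbi; // evaluate along the best path only
double viterbi = hmm.LogLikelihood(new double[] { 0.1, 0.2 });
```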
**Methods**

Name | Description
---|---
Clone | Creates a new object that is a copy of the current instance.
Decide(TInput[]) | Computes class-label decisions for the given input. (Inherited from TaggerBase<TInput>.)
Decide(TInput[][]) | Computes class-label decisions for the given inputs. (Inherited from TaggerBase<TInput>.)
Decide(TObservation[], Int32[]) | Computes class-label decisions for the given input. (Overrides TaggerBase<TInput>.Decide(TInput[], Int32[]).)
Decide(TObservation[][], Int32[][]) | Computes class-label decisions for the given inputs. (Overrides TaggerBase<TInput>.Decide(TInput[][], Int32[][]).)
Decode(TObservation[]) | Obsolete. Calculates the most likely sequence of hidden states that produced the given observation sequence.
Decode(TObservation[], Double) | Obsolete. Calculates the most likely sequence of hidden states that produced the given observation sequence.
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.)
Evaluate(TObservation[]) | Obsolete. Calculates the likelihood that this model has generated the given sequence.
Evaluate(TObservation[], Int32[]) | Obsolete. Calculates the log-likelihood that this model has generated the given observation sequence along the given state path.
Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)
Generate(Int32) | Generates a random vector of observations from the model.
Generate(Int32, Int32[], Double) | Generates a random vector of observations from the model.
GetHashCode | Serves as the default hash function. (Inherited from Object.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
LogLikelihood(TInput[]) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihood(TInput[][]) | Predicts the probability that the sequence vectors have been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihood(TInput[], Int32[]) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihood(TInput[][], Int32[][]) | Predicts the probability that the sequence vectors have been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihood(TObservation[], Int32[]) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger along the given path of hidden states.
LogLikelihood(TObservation[], Double) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger. (Overrides LikelihoodTaggerBase<TInput>.LogLikelihood(TInput[], Double).)
LogLikelihood(TObservation[][], Int32[][]) | Predicts the probability that the sequence vectors have been generated by this log-likelihood tagger along the given paths of hidden states.
LogLikelihood(TObservation[], Int32[], Double) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger along the given path of hidden states.
LogLikelihood(TObservation[][], Int32[][], Double) | Predicts the probability that the sequence vectors have been generated by this log-likelihood tagger. (Overrides LikelihoodTaggerBase<TInput>.LogLikelihood(TInput[][], Int32[][], Double).)
LogLikelihoods(TInput[]) | Predicts the log-likelihood for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihoods(TInput[][]) | Predicts the log-likelihood for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihoods(TInput[], Int32[]) | Predicts the log-likelihood for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihoods(TInput[][], Double) | Predicts the log-likelihood for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihoods(TInput[][], Int32[][]) | Predicts the log-likelihood for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihoods(TObservation[], Double) | Predicts the log-likelihood for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Overrides LikelihoodTaggerBase<TInput>.LogLikelihoods(TInput[], Double).)
LogLikelihoods(TInput[][], Int32[][], Double) | Predicts the log-likelihood for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
LogLikelihoods(TObservation[], Int32[], Double) | Predicts the log-likelihood for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Overrides LikelihoodTaggerBase<TInput>.LogLikelihoods(TInput[], Int32[], Double).)
MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.)
Posterior(TObservation[]) | Obsolete. Calculates the probability of each hidden state for each observation in the observation vector.
Posterior(TObservation[], Int32[]) | Obsolete. Calculates the probability of each hidden state for each observation in the observation vector, and uses those probabilities to decode the most likely sequence of states for each observation in the sequence using the posterior decoding method. See remarks for details.
Predict(TObservation[]) | Predicts the next observation occurring after a given observation sequence.
Predict(TObservation[], Double) | Predicts the next observation occurring after a given observation sequence.
Predict(TObservation[], Int32) | Predicts the next observations occurring after a given observation sequence.
Predict(TObservation[], Int32, Double) | Predicts the next observations occurring after a given observation sequence.
Predict<TMultivariate>(TObservation[], MultivariateMixture<TMultivariate>) | Predicts the next observation occurring after a given observation sequence.
Predict<TUnivariate>(TObservation[], Mixture<TUnivariate>) | Predicts the next observation occurring after a given observation sequence.
Predict<TMultivariate>(TObservation[], Double, MultivariateMixture<TMultivariate>) | Predicts the next observation occurring after a given observation sequence.
Predict<TUnivariate>(TObservation[], Double, Mixture<TUnivariate>) | Predicts the next observation occurring after a given observation sequence.
Probabilities(TInput[]) | Predicts the probabilities for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probabilities(TInput[][]) | Predicts the probabilities for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probabilities(TInput[], Double) | Predicts the probabilities for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probabilities(TInput[], Int32) | Predicts the probabilities for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probabilities(TInput[][], Double) | Predicts the probabilities for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probabilities(TInput[][], Int32) | Predicts the probabilities for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probabilities(TInput[], Int32, Double) | Predicts the probabilities for each of the observations in the sequence vector assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probabilities(TInput[][], Int32, Double) | Predicts the probabilities for each of the observations in the sequence vectors assuming each of the possible states in the tagger model. (Inherited from LikelihoodTaggerBase<TInput>.)
Probability(TInput[]) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
Probability(TInput[][]) | Predicts the probability that the sequence vectors have been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
Probability(TInput[], Int32) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
Probability(TInput[], Double) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
Probability(TInput[][], Int32) | Predicts the probability that the sequence vectors have been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
Probability(TInput[], Int32, Double) | Predicts the probability that the sequence vector has been generated by this log-likelihood tagger. (Inherited from LikelihoodTaggerBase<TInput>.)
Scores(TInput[]) | Computes numerical scores measuring the association between the given sequence of vectors and each possible class. (Inherited from ScoreTaggerBase<TInput>.)
Scores(TInput[][]) | Computes numerical scores measuring the association between each of the given sequences of vectors and each possible class. (Inherited from ScoreTaggerBase<TInput>.)
Scores(TInput[][], Double) | Computes numerical scores measuring the association between each of the given sequences of vectors and each possible class. (Inherited from LikelihoodTaggerBase<TInput>.)
Scores(TInput[], Double) | Computes numerical scores measuring the association between the given sequence of vectors and each possible class. (Inherited from ScoreTaggerBase<TInput>.)
Scores(TInput[], Int32) | Computes numerical scores measuring the association between the given sequence of vectors and each possible class. (Inherited from ScoreTaggerBase<TInput>.)
Scores(TInput[][], Int32) | Computes numerical scores measuring the association between each of the given sequences of vectors and each possible class. (Inherited from ScoreTaggerBase<TInput>.)
Scores(TInput[][], Int32, Double) | Computes numerical scores measuring the association between each of the given sequences of vectors and each possible class. (Inherited from LikelihoodTaggerBase<TInput>.)
Scores(TInput[], Int32, Double) | Computes numerical scores measuring the association between the given sequence of vectors and each possible class. (Inherited from ScoreTaggerBase<TInput>.)
ToString | Returns a string that represents the current object. (Inherited from Object.)
Transform(TInput[]) | Applies the transformation to an input, producing an associated output. (Inherited from TaggerBase<TInput>.)
Transform(TInput[][]) | Applies the transformation to a set of input vectors, producing an associated set of output vectors. (Inherited from TransformBase<TInput, TOutput>.)
Transform(TInput[], Double) | Applies the transformation to an input, producing an associated output. (Inherited from LikelihoodTaggerBase<TInput>.)
Transform(TInput[], TOutput[]) | Applies the transformation to an input, producing an associated output. (Inherited from TransformBase<TInput, TOutput>.)
**Extension Methods**

Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To<T>() | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
Hidden Markov Models (HMMs) are stochastic methods for modeling temporal and sequence data. They are especially known for their applications in temporal pattern recognition, such as speech, handwriting, and gesture recognition, part-of-speech tagging, musical score following, partial discharges, and bioinformatics.
This page refers to the arbitrary-density (continuous emission distributions) version of the model. For discrete distributions, please see HiddenMarkovModel.
A dynamical system of discrete nature that is assumed to be governed by a Markov chain emits a sequence of observable outputs. Under the Markov assumption, the next state of the system depends only on its current state; it is also assumed that the latest output depends only on the current state of the system. Such states are often not known to the observer when only the output values are observable.
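In symbols (a standard formulation, not taken from this page: x_t denotes the hidden state and y_t the output at time t), the two assumptions read:

```latex
% Markov assumption: the next state depends only on the current state
P(x_t \mid x_{t-1}, x_{t-2}, \ldots, x_1) = P(x_t \mid x_{t-1})

% Output independence: the current output depends only on the current state
P(y_t \mid x_t, x_{t-1}, \ldots, x_1, y_{t-1}, \ldots, y_1) = P(y_t \mid x_t)
```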
Hidden Markov Models attempt to model such systems and allow, among other things, (1) inferring the most likely sequence of states that produced a given output sequence, (2) inferring which will be the most likely next state (and thus predicting the next output), and (3) calculating the probability that a given sequence of outputs originated from the system (allowing the use of hidden Markov models for sequence classification).
The “hidden” in Hidden Markov Models comes from the fact that the observer does not know which state the system may be in, but has only a probabilistic insight into where it should be.
The arbitrary-density Hidden Markov Model uses any probability density function (such as a Gaussian Mixture Model) for computing the state probabilities. In other words, in a continuous HMM the matrix of emission probabilities B is replaced by an array of either discrete or continuous probability density functions.
If a general discrete distribution is used as the underlying probability density function, the model becomes equivalent to the discrete Hidden Markov Model.
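To make the replacement of the emission matrix B concrete, here is a minimal sketch (using the (ITopology, TDistribution[]) constructor listed above; the Gaussian parameters and the Ergodic topology are arbitrary choices, not from this page's examples):

```csharp
// Sketch: a two-state continuous HMM whose emission "matrix" B is
// an array of Gaussian densities, one per hidden state.
var B = new NormalDistribution[]
{
    new NormalDistribution(mean: 0, stdDev: 1), // state 0 emits values near 0
    new NormalDistribution(mean: 5, stdDev: 2), // state 1 emits values near 5
};

var hmm = new HiddenMarkovModel<NormalDistribution, double>(new Ergodic(2), B);
```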
For a more thorough explanation of the fundamentals of how Hidden Markov Models work, please see the HiddenMarkovModel documentation page. To learn a Markov model, you can find a list of both supervised and unsupervised learning algorithms in the Accord.Statistics.Models.Markov.Learning namespace.
The example below reproduces the example given in the Wikipedia entry for the Viterbi algorithm (http://en.wikipedia.org/wiki/Viterbi_algorithm). Being an arbitrary-density model, it can be used with any available probability distribution, including discrete ones. In the following example, the generic model is used with a GeneralDiscreteDistribution to reproduce the same example given in HiddenMarkovModel. Below, the model's parameters are initialized manually; however, it is possible to learn them automatically using BaumWelchLearning<TDistribution, TObservation>.
```csharp
// Create the transition matrix A
double[,] transitions =
{
    { 0.7, 0.3 },
    { 0.4, 0.6 }
};

// Create the vector of emission densities B
GeneralDiscreteDistribution[] emissions =
{
    new GeneralDiscreteDistribution(0.1, 0.4, 0.5),
    new GeneralDiscreteDistribution(0.6, 0.3, 0.1)
};

// Create the initial probabilities pi
double[] initial = { 0.6, 0.4 };

// Create a new hidden Markov model with discrete probabilities
var hmm = new HiddenMarkovModel<GeneralDiscreteDistribution, double>(transitions, emissions, initial);

// After that, one could, for example, query the probability
// of a sequence occurring. We will consider the sequence
double[] sequence = new double[] { 0, 1, 2 };

// And now we will evaluate its likelihood
double logLikelihood = hmm.LogLikelihood(sequence);

// At this point, the log-likelihood of the sequence
// occurring within the model is -3.3928721329161653.

// We can also get the Viterbi path of the sequence
int[] path = hmm.Decide(sequence);

// Or also its Viterbi likelihood alongside the path
double viterbi = hmm.LogLikelihood(sequence, ref path);

// At this point, the state path will be 1-0-0 and the
// log-likelihood will be -4.3095199438871337
```
Examples on how to learn hidden Markov models can be found on the documentation pages of the respective learning algorithms: BaumWelchLearning<TDistribution, TObservation>, ViterbiLearning<TDistribution, TObservation>, and MaximumLikelihoodLearning<TDistribution, TObservation>. The simplest of such examples can be seen below:
```csharp
// Continuous Markov Models can operate using any
// probability distribution, including discrete ones.

// In the following example, we will try to create a
// Continuous Hidden Markov Model using a discrete
// distribution to detect if a given sequence starts
// with a zero and has any number of ones after that.

int[][] sequences = new double[][]
{
    new double[] { 0, 1, 1, 1, 1, 0, 1, 1, 1, 1 },
    new double[] { 0, 1, 1, 1, 0, 1, 1, 1, 1, 1 },
    new double[] { 0, 1, 1, 1, 1, 1, 1, 1, 1, 1 },
    new double[] { 0, 1, 1, 1, 1, 1 },
    new double[] { 0, 1, 1, 1, 1, 1, 1 },
    new double[] { 0, 1, 1, 1, 1, 1, 1, 1, 1, 1 },
    new double[] { 0, 1, 1, 1, 1, 1, 1, 1, 1, 1 },
}.ToInt32(); // ToInt32() is an Accord.Math extension converting the jagged double array

// Create a new Hidden Markov Model with 3 states and
// a generic discrete distribution with two symbols
var hmm = HiddenMarkovModel.CreateDiscrete(3, 2);

// Try to fit the model to the data until the difference in
// the average log-likelihood changes only by as little as 0.0001
var teacher = new BaumWelchLearning<GeneralDiscreteDistribution, int>(hmm)
{
    Tolerance = 0.0001,
    Iterations = 0
};

// Learn the model
teacher.Learn(sequences);
double ll = Math.Exp(teacher.LogLikelihood);

// Calculate the probability that the given
// sequences originated from the model
double l1 = hmm.Probability(new int[] { 0, 1 });       // 0.999
double l2 = hmm.Probability(new int[] { 0, 1, 1, 1 }); // 0.916

// Sequences which do not start with zero have much lower probability.
double l3 = hmm.Probability(new int[] { 1, 1 });       // 0.000
double l4 = hmm.Probability(new int[] { 1, 0, 0, 0 }); // 0.000

// Sequences which contain a few errors have higher probability
// than the ones which do not start with zero. This shows some
// of the temporal elasticity and error tolerance of the HMMs.
double l5 = hmm.Probability(new int[] { 0, 1, 0, 1, 1, 1, 1, 1, 1 }); // 0.034
double l6 = hmm.Probability(new int[] { 0, 1, 1, 1, 1, 1, 1, 0, 1 }); // 0.034
```
Markov models can also be trained without having, in fact, "hidden" parts. The following example shows how hidden Markov models trained using Maximum Likelihood Learning can be used in the context of fraud analysis, in which we actually know in advance the class labels for each state in the sequences we are trying to learn:
```csharp
// Ensure results are reproducible
Accord.Math.Random.Generator.Seed = 0;

// Let's say we have the following data about credit card transactions,
// where the data is organized in order of transaction, per credit card
// holder. Every time the "Time" column starts at zero, it denotes that the
// sequence of observations that follows corresponds to transactions of the
// same person:

double[,] data =
{
    // "Time", "V1",  "V2",  "V3",  "V4", "V5", "Amount", "Fraud"
    {  0,      0.521, 0.124, 0.622, 15.2, 25.6,   2.70,    0 }, // first person, ok
    {  1,      0.121, 0.124, 0.822, 12.2, 25.6,  42.0,     0 }, // first person, ok

    {  0,      0.551, 0.124, 0.422, 17.5, 25.6,  20.0,     0 }, // second person, ok
    {  1,      0.136, 0.154, 0.322, 15.3, 25.6,  50.0,     0 }, // second person, ok
    {  2,      0.721, 0.240, 0.422, 12.2, 25.6, 100.0,     1 }, // second person, fraud!
    {  3,      0.222, 0.126, 0.722, 18.1, 25.8,  10.0,     0 }, // second person, ok
};

// Transform the above data into a jagged matrix
double[][][] input;
int[][] states;
transform(data, out input, out states);

// Determine here the number of dimensions in the observations (in this case, 6)
int observationDimensions = 6; // 6 columns: "V1", "V2", "V3", "V4", "V5", "Amount"

// Create some prior distributions to help initialize our parameters
var priorC = new WishartDistribution(dimension: observationDimensions, degreesOfFreedom: 10); // this 10 is arbitrary; you might have to tune it as if it were a hyperparameter
var priorM = new MultivariateNormalDistribution(dimension: observationDimensions);

// Configure the learning algorithm to train the sequence classifier
var teacher = new MaximumLikelihoodLearning<MultivariateNormalDistribution, double[]>()
{
    // The emissions will be multivariate Normal distributions initialized using the prior distributions
    Emissions = (j) => new MultivariateNormalDistribution(mean: priorM.Generate(), covariance: priorC.Generate()),

    // We will prevent our covariance matrices from becoming degenerate by adding a small
    // regularization value to their diagonal until they become positive-definite again:
    FittingOptions = new NormalOptions()
    {
        Regularization = 1e-6
    },
};

// Use the teacher to learn a new HMM
var hmm = teacher.Learn(input, states);

// Use the HMM to predict whether the transactions were fraudulent or not:
int[] firstPerson = hmm.Decide(input[0]);  // predict the first person, output should be: 0, 0
int[] secondPerson = hmm.Decide(input[1]); // predict the second person, output should be: 0, 0, 1, 0
```
Where the transform function is defined as:
```csharp
private static void transform(double[,] data, out double[][][] input, out int[][] states)
{
    var sequences = new List<double[][]>();
    var classLabels = new List<int[]>();

    List<double[]> currentSequence = null;
    List<int> currentLabels = null;

    for (int i = 0; i < data.Rows(); i++)
    {
        // Check if the first column contains a zero; this would be an indication
        // that a new sequence (for a different person) is beginning:
        if (data[i, 0] == 0)
        {
            // Yes, this is a new sequence. Check if we were building
            // a sequence before, and if yes, save it to the list:
            if (currentSequence != null)
            {
                // Save the sequence we had so far
                sequences.Add(currentSequence.ToArray());
                classLabels.Add(currentLabels.ToArray());

                currentSequence = null;
                currentLabels = null;
            }

            // We will be starting a new sequence
            currentSequence = new List<double[]>();
            currentLabels = new List<int>();
        }

        double[] features = data.GetRow(i).Get(1, 7); // Get values in columns from 1 (inclusive) to 7 (exclusive), meaning "V1", "V2", "V3", "V4", "V5", and "Amount"
        int classLabel = (int)data[i, 7]; // The column at index 7 corresponds to the class label column ("Fraud")

        // Save this information:
        currentSequence.Add(features);
        currentLabels.Add(classLabel);
    }

    // Check if there are any sequences and labels that we haven't saved yet:
    if (currentSequence != null)
    {
        // Yes there are: save them
        sequences.Add(currentSequence.ToArray());
        classLabels.Add(currentLabels.ToArray());
    }

    input = sequences.ToArray();
    states = classLabels.ToArray();
}
```
Hidden Markov Models can also be used to predict the next observation in a sequence. This can be done by inspecting the forward matrix of probabilities for the sequence and checking which state is most likely to follow the current one. The model then returns the most likely value (the mode) of the distribution associated with that state. This limits the applicability of the model to very short-term predictions (i.e., most likely, only the most immediate next observation).
```csharp
// We will try to create a Hidden Markov Model which
// can recognize (and predict) the following sequences:
double[][] sequences =
{
    new double[] { 1, 3, 5, 7, 9, 11, 13 },
    new double[] { 1, 3, 5, 7, 9, 11 },
    new double[] { 1, 3, 5, 7, 9, 11, 13 },
    new double[] { 1, 3, 3, 7, 7, 9, 11, 11, 13, 13 },
    new double[] { 1, 3, 7, 9, 11, 13 },
};

// Create a Baum-Welch HMM algorithm:
var teacher = new BaumWelchLearning<NormalDistribution, double, NormalOptions>()
{
    // Let's create a left-to-right (forward)
    // Hidden Markov Model with 7 hidden states
    Topology = new Forward(7),

    FittingOptions = new NormalOptions()
    {
        Regularization = 1e-8
    },

    // We'll try to fit the model to the data until the difference in
    // the average log-likelihood changes only by as little as 0.0001
    Tolerance = 0.0001,
    Iterations = 0 // do not impose a limit on the number of iterations
};

// Use the algorithm to learn a new Markov model:
HiddenMarkovModel<NormalDistribution, double> hmm = teacher.Learn(sequences);

// Now, we will try to predict the next 1 observation after a base symbol sequence
double[] prediction = hmm.Predict(observations: new double[] { 1, 3, 5, 7, 9 }, next: 1);

// At this point, prediction should be around double[] { 11.909090909090905 }
double nextObservation = prediction[0]; // should be comparatively near 11
```