

jluttine commented on June 12, 2024

Hi, thanks for your message! I have never used Hidden Markov Trees, but they seem interesting. I may have misunderstood something, but I came up with three possible solutions:

1) Use fully factorizing mean-field approximation.

As you described, you could write something like this to construct a binary tree structure:

import numpy as np
from bayespy.nodes import Dirichlet, Categorical, Mixture

D = 3  # number of hidden states
A = Dirichlet(np.ones(D), plates=(D,))  # DxD transition matrix
z = Categorical(np.ones(D)/D)           # root state
z_0 = Mixture(z, Categorical, A)        # left child of the root
z_1 = Mixture(z, Categorical, A)        # right child of the root
z_00 = Mixture(z_0, Categorical, A)
z_01 = Mixture(z_0, Categorical, A)
z_10 = Mixture(z_1, Categorical, A)
z_11 = Mixture(z_1, Categorical, A)
# and so on..

An equivalent construction is possible with the Gate node; it makes no difference in this case. You could also use, for instance, different transition matrices for the left and right branches if you wanted. The problem with this construction is that the posterior approximation factorizes with respect to the variables, so the approximation might be poor and convergence slow, as you pointed out. You should also think about the order in which you want to update your variables. But at least this is easy to implement and test on small problems.
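
For instance, branch-specific transition matrices could look like this (a small sketch reusing z and D from the snippet above; A_left and A_right are hypothetical names):

from bayespy.nodes import Dirichlet, Categorical, Mixture, Gate

# The Gate-based equivalent of a Mixture child would be:
#   z_0 = Categorical(Gate(z, A))
# Separate transition matrices for left and right turns:
A_left = Dirichlet(np.ones(D), plates=(D,))
A_right = Dirichlet(np.ones(D), plates=(D,))
z_0 = Mixture(z, Categorical, A_left)   # left child of the root
z_1 = Mixture(z, Categorical, A_right)  # right child of the root
z_00 = Mixture(z_0, Categorical, A_left)
z_01 = Mixture(z_0, Categorical, A_right)
# and so on..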

2) Use separate chains for the paths to each leaf node

In order to reduce the amount of factorization, you could factorize at the branch level. That is, each branch is a new categorical Markov chain. It is a bit hard to explain, but the idea is that each left "turn" triggers a new branch, while each right "turn" belongs to the same chain as its parent. Thus, the sequences 1, 11, 111, 1111, ... would be in one chain, and the sequences 0, 01, 011, 0111, ... in another. Likewise, 10, 101, 1011, ... would be a new chain branching at the second level. So, you would have as many chains as you have leaf nodes, that is, 2^(K-1) where K is the height/depth of your tree. The path to each leaf is one chain, and the path starts from the last left "turn" before the leaf. However, this is a somewhat complicated construction and the factorization is not symmetric, so it is difficult, at least for me, to tell what kind of biases this factorization causes.

This should be possible to implement, but there is a minor detail you need to know: how to access the individual states of a chain (the question you marked with (**)). At the moment, that is slightly complicated, because indexing is used for (independent) plates only. Thus, you need to do something like this to access the individual state variables of a chain (a sketch wiring two chains together follows after this snippet):

import numpy as np
from bayespy.nodes import CategoricalMarkovChain
from bayespy.inference.vmp.nodes.categorical import CategoricalMoments

Z = CategoricalMarkovChain(np.ones(D)/D, np.ones((D,D))/D, states=100)  # dummy chain
Z_tmp = Z._convert(CategoricalMoments)  # view the chain as individual categorical variables
Z_tmp[-1]  # access the last state
Z_tmp[2]   # access the third state

Yep, I know it's complicated, sorry. I should provide a simpler interface for that.
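
With that trick, a minimal sketch of the branch-per-leaf idea for a tiny tree might look like the following; D, the chain lengths, and the branching point are arbitrary choices here, and Gate turns a parent state into an initial distribution in the same way as in your example:

import numpy as np
from bayespy.nodes import CategoricalMarkovChain, Dirichlet, Gate
from bayespy.inference.vmp.nodes.categorical import CategoricalMoments

D = 3
A = Dirichlet(np.ones(D), plates=(D,))  # shared DxD transition matrix

# The all-right-turns path 1, 11, 111, 1111 is one chain.
main = CategoricalMarkovChain(np.ones(D)/D, A, states=4)
main_states = main._convert(CategoricalMoments)

# A left turn after the second state starts a new chain whose initial
# distribution is the transition row selected by that state.
branch = CategoricalMarkovChain(Gate(main_states[1], A), A, states=3)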

3) Write custom node for tree (or arbitrary graph) structure categorical variables

Quite often, efficient inference with little factorization requires custom nodes that utilize some specific structure. For instance, GaussianMarkovChain implements a Kalman filter/smoother type algorithm and CategoricalMarkovChain implements the alpha-beta recursion. Thus, I guess the best solution is to write a custom node which implements some kind of belief propagation algorithm to perform exact inference for a categorical Markov tree given some parameters. The challenge is then in making the node as general and minimalistic as possible so that it can easily be used as a building block in large models. However, I'm not too familiar with inference algorithms for Markov trees, so I'm not sure how to implement it without studying them a bit.
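
To give an idea of what such a node would compute internally, here is a minimal sketch of the upward (sum-product) pass for exact inference in a categorical Markov tree, written in plain NumPy rather than as a BayesPy node; the names (subtree_evidence, children, lik) and the toy numbers are made up for illustration:

import numpy as np

def subtree_evidence(node, children, A, lik):
    # b[j] = p(all evidence in the subtree of `node` | z_node = j)
    b = lik[node].copy()
    for c in children.get(node, []):
        # Message from child c to `node`: marginalize out the child's state.
        b *= A @ subtree_evidence(c, children, A, lik)
    return b

# Toy tree: node 0 is the root, with children 1 and 2.
D = 3
rng = np.random.default_rng(0)
A = rng.dirichlet(np.ones(D), size=D)        # A[i, j] = p(child = j | parent = i)
lik = {n: rng.random(D) for n in (0, 1, 2)}  # per-node evidence likelihoods
children = {0: [1, 2]}

b_root = subtree_evidence(0, children, A, lik)
posterior = (np.ones(D) / D) * b_root        # uniform prior on the root state
posterior /= posterior.sum()                 # p(z_root | all evidence)
print(posterior)

A complementary downward pass would then give the posterior marginals of the non-root states, analogously to the alpha-beta recursion mentioned above.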

I hope I understood you correctly and I hope this helps. Please comment or ask further questions if you want.


redst4r commented on June 12, 2024

Hi,

thanks for your detailed response. I'm currently kind of busy with other stuff, but hopefully I'll have some time in the next few days to look at the ideas you proposed; they look promising!

I discovered a slight mistake in my example above: it should of course read

from bayespy.nodes import Gate, CategoricalMarkovChain

lastA = A[-1]  # somehow get the last variable of chain A (**)
linkerNode = Gate(lastA, P)
B = CategoricalMarkovChain(linkerNode, P, states=N_B)
C = CategoricalMarkovChain(linkerNode, P, states=N_C)

That is, the last hidden state of A determines the initial distribution of the two following chains.
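
Presumably the (**) line can be realized with the _convert trick from your first comment, something like:

from bayespy.inference.vmp.nodes.categorical import CategoricalMoments

lastA = A._convert(CategoricalMoments)[-1]  # last hidden state of chain A (assumed usage)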

