## Idea - Using compression to extract building blocks

So, initially, when I was using CTW (context tree weighting), the problem was with VQ (vector quantization): in particular, the quantization was learned independently of the inference in the Markovian model. I solved that by switching to HMMs, so that the quantization is learned simultaneously with the model structure.

## Idea - Can we use left-to-right HMMs?

Can we use left-to-right HMMs (probably with emissions from a mixture of Gaussians), and still use the number of states as a complexity measure? Or perhaps a combination of $$N_{states}$$ and $$N_{mixtures}$$? This might function as an indirect way of representing temporal structure.
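One concrete way to operationalize such a complexity measure is to count the free parameters of each candidate left-to-right GMM-HMM and score models with BIC, so that $$N_{states}$$ and $$N_{mixtures}$$ are penalized jointly. Below is a minimal sketch under stated assumptions: the helper names are hypothetical, the emissions are diagonal-covariance Gaussian mixtures, and the start state is fixed at state 0 (so start probabilities contribute no free parameters).

```python
import math

def lr_gmmhmm_n_params(n_states, n_mix, n_dims):
    """Free parameters of a left-to-right HMM with diagonal-covariance
    Gaussian-mixture emissions and a fixed start state (hypothetical
    parameter-counting convention)."""
    # Transitions: each non-final state only chooses between staying and
    # advancing, so one free parameter per state except the last one.
    trans = n_states - 1
    # Per state: mixture weights (n_mix - 1 free), plus means and diagonal
    # variances (n_mix * n_dims values each).
    emis = n_states * ((n_mix - 1) + 2 * n_mix * n_dims)
    return trans + emis

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: lower is better."""
    return -2.0 * log_likelihood + n_params * math.log(n_obs)

# Example: rank candidate (n_states, n_mix) pairs for 1-D observations,
# given each model's fitted log-likelihood (placeholder values here).
candidates = {(2, 1): -120.0, (3, 2): -100.0, (5, 3): -95.0}
scores = {k: bic(ll, lr_gmmhmm_n_params(k[0], k[1], 1), 200)
          for k, ll in candidates.items()}
best = min(scores, key=scores.get)
```

The design choice here is that model complexity enters only through the parameter count, so the trade-off between adding states and adding mixture components is handled by a single criterion rather than two separate thresholds.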

## New regression on original dataset - Correction

Original post is here.

## New regression on original dataset

It works pretty well for the original dataset. Here's what comes out:

## New regression equation for LeapArticulator experiments

I have just discovered a better model for participants' scores in the experiment. This used to be our primary model:

```r
model <- lmer(score ~ nstates_amp_and_freq_n + (1 | id) + (1 | condition:phase),
              data = all, REML = F)
```

I have discovered that the following one outperforms it for both the original and the discrete datasets, and is much easier to interpret:

```r
model <- lmer(score ~ nstates_amp_and_freq_n:phase_order:phase + (1 | id),
              data = all, REML = F)
```

Of course, we are using MCMC samples, so the model declaration becomes: