Posted on 13 December 2012

*Part 11 in a series of videos recorded from ACM MIRUM 2012 in Nara, Japan.*

Moods and emotions are usually characterized using one of two models: a set of discrete mood classes, or a space of continuous values such as the valence-arousal model. However, the numerical and statistical relationships between these two models -- discrete categories vs. continuous dimensions -- have not been extensively studied. Given a mood tag, it would be nice to know where in the valence-arousal space the tag lies. And given a point in the valence-arousal space, it would be nice to identify which mood tags are most relevant.

Ju-Chiang Wang presents a computational model that connects the two. Given a musical input, this unified model uses a set of latent states to link the discrete categorical space (modeled with Bernoulli distributions) to the valence-arousal space (modeled with Gaussians). The unified model not only locates any mood tag in the valence-arousal space, but also estimates how strongly the mood is expressed at that point.
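The talk does not spell out the model's equations, but the described structure -- latent states that each carry a Gaussian in valence-arousal space and Bernoulli probabilities over mood tags -- can be sketched with toy, hand-picked parameters. Everything below (the tag set, the means, the Bernoulli table) is a hypothetical illustration, not Wang's actual trained model:

```python
import numpy as np

# Toy parameters (assumptions for illustration): K latent states, each with
# a Gaussian in valence-arousal (VA) space and a Bernoulli on/off probability
# for each mood tag.
K = 4
tags = ["happy", "sad", "angry", "calm"]
means = np.array([[0.8, 0.6],    # state 0: high valence, high arousal
                  [-0.7, -0.5],  # state 1: low valence, low arousal
                  [-0.6, 0.8],   # state 2: low valence, high arousal
                  [0.6, -0.6]])  # state 3: high valence, low arousal
covs = np.stack([0.05 * np.eye(2)] * K)  # isotropic covariances, shape (K, 2, 2)
# bern[k, t] = P(tag t applies | latent state k)
bern = np.array([[0.90, 0.05, 0.05, 0.30],
                 [0.05, 0.90, 0.10, 0.20],
                 [0.05, 0.10, 0.90, 0.02],
                 [0.30, 0.10, 0.02, 0.90]])
prior = np.full(K, 1.0 / K)  # uniform prior over latent states


def tag_to_va(tag):
    """Map a mood tag to a VA point: posterior over latent states given the
    tag, then the posterior-weighted mean of the state Gaussians."""
    t = tags.index(tag)
    post = prior * bern[:, t]
    post /= post.sum()
    return post @ means  # expected (valence, arousal)


def gauss_pdf(x, mean, cov):
    """Density of a 2-D Gaussian at x."""
    d = x - mean
    norm = 1.0 / (2 * np.pi * np.sqrt(np.linalg.det(cov)))
    return norm * np.exp(-0.5 * d @ np.linalg.inv(cov) @ d)


def va_to_tags(point):
    """Rank mood tags at a VA point: Gaussian responsibilities over latent
    states, then the responsibility-weighted Bernoulli probabilities."""
    resp = prior * np.array([gauss_pdf(point, means[k], covs[k]) for k in range(K)])
    resp /= resp.sum()
    scores = resp @ bern  # per-tag relevance at this point
    return sorted(zip(tags, scores), key=lambda p: -p[1])


print(tag_to_va("happy"))                    # a point with positive valence and arousal
print(va_to_tags(np.array([0.8, 0.6]))[0])  # the most relevant tag near that point
```

The mixture structure is what gives the "strength" estimate mentioned in the talk: the tag scores returned by `va_to_tags` are probabilities, not just a nearest-tag label, so a point deep inside one Gaussian scores its tag close to 1 while a point between states yields a softer ranking.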