The neuroscientific investigation of emotions is hindered by a lack of rapid and precise readouts of emotion states in model organisms. Dolensek et al. identified facial expressions as innate and sensitive reflections of the internal emotion state in mice (see the Perspective by Girard and Bellone). Mouse facial expressions evoked by diverse stimuli could be classified into emotion-like categories, similar to basic emotions in humans. Machine-learning algorithms categorized mouse facial expressions objectively and quantitatively at millisecond time scales. Intensity, valence, and persistence of subjective emotion states could thus be decoded in individual animals. Combining facial expression analysis with two-photon calcium imaging allowed the identification of single neurons in the insular cortex, a brain region implicated in affective experiences in humans, whose activity closely correlated with specific facial expressions.
Science, this issue p. ; see also p. 
Understanding the neurobiological underpinnings of emotion relies on objective readouts of an individual's emotional state, which remains a major challenge, especially in animal models. We found that mice exhibit stereotyped facial expressions in response to emotionally salient events, as well as upon targeted manipulations of emotion-relevant neuronal circuits. Facial expressions were classified into distinct categories using machine learning and reflected the changing intrinsic value of the same sensory stimulus encountered under different homeostatic or affective conditions. Facial expressions revealed emotion features such as intensity, valence, and persistence. Two-photon imaging uncovered insular cortical neuron activity that correlated with specific facial expressions and may encode distinct emotions. Facial expressions thus provide a means to infer emotion states and their neuronal correlates in mice.