Facing a decision: Researchers analyzed motion energy across different mouse facial regions during a foraging task.
Davide Reato & Fanny Cazettes

Facial movements telegraph cognition in mice

If you give a mouse a decision, its thought process may show on its face.

Bad news for mouse poker players: Their facial movements offer “tells” about decision-making variables that the animals track without always acting on them, according to a study published today in Nature Neuroscience.

The findings indicate that “cognition is embodied in some surprising ways,” says study investigator Zachary Mainen, a researcher at the Champalimaud Center for the Unknown. And this motor activity holds promise as a noninvasive bellwether of cognitive patterns.

The study builds on mounting evidence that mouse facial expressions are not solely the result of a task’s motor demands and provides a “very clear” illustration of how this movement reflects cognitive processes, says Marieke Schölvinck, a researcher at the Ernst Strüngmann Institute for Neuroscience, who was not involved with the work.

For years, mouse facial movements have mostly served as a way for researchers to gauge an animal’s pain levels. Now, however, machine-learning technology has made it possible to analyze this fine motor behavior in greater detail, says Schölvinck, who has investigated how facial expressions reflect inner states in mice and macaques.

Evidence that mouse facial expressions correspond to emotional states inspired the new analysis, according to Fanny Cazettes, who conducted the experiments as a postdoctoral researcher in Mainen’s lab. She says she wondered what other ways the “internal, private thoughts of animals” might manifest on their faces.

Two variables shape most mouse decisions over different foraging sites, the team found: the number of failures at a site (unrewarded licks from a source of sugar water) and the site's perceived value (the difference between reward and failure).

These decision variables underlie the three strategies mice use to choose between two foraging sites. Mice using a stimulus-based strategy make decisions according to a site’s value, whereas animals relying on an inference-based strategy judge the number of failures at their current foraging spot. Inference-based strategies are either impulsive or persistent, depending on an animal’s innate inclination to switch sites.

Even when mice employ one strategy, they still monitor other decision variables and can switch tactics, Cazettes reported in a 2023 paper. The secondary motor cortex (M2) orchestrates this tracking, according to electrophysiological recordings and optogenetic inactivation of cells in the region.

This internal variable tracking translates to differences in facial movements too subtle to detect by the human eye, the new study indicates. These expressions correlate with multiple latent decision variables, even if those variables are not part of a mouse’s current foraging strategy, according to a frame-by-frame video analysis that used a deep-learning algorithm called Facemap to predict neural activity from facial movements.

Activation of the M2 appears to drive motor activity and not the other way around, the researchers found. A model based on electrophysiological readouts was able to decode decision variables earlier than a model based on facial movement. Optogenetic inactivation of M2 produced immediate motor changes that hampered the facial movement model’s ability to predict decision variables.

The experiments demonstrate that M2 involvement is more than a byproduct of motions taken to execute a decision, says Alfonso Renart, who supervised the study alongside Mainen. Recent findings from other researchers also support a role for M2 activation in behavioral flexibility rather than motor task execution.

Read my licks: Facial movements corresponded to variables involved in decision-making, not just in executing a choice.

To Mainen, the connection between cognition and the body suggests that controlling for all possible motor activity may be “a fool’s errand.” He recalls a conversation with nonhuman-primate researchers hoping to disentangle body movement from other signals in electrophysiological recordings. “How do you know the monkey isn’t clenching its butt cheeks when you’re doing that?” he says, illustrating these worries. “And that actually your supposedly prefrontal correlate … isn’t just from the monkey’s hidden muscle movements?”

It’s possible that all brain signaling ultimately corresponds to some motor activity, Mainen says, adding that scientists “can’t just keep saying everything that appears anywhere in the body has to be excluded from being a signal.”

Future research may flag sites besides the M2 that connect decision-making and facial movements, says Carsen Stringer, a group leader at the Howard Hughes Medical Institute's Janelia Research Campus who co-led the team that developed Facemap but was not involved in the new study.

The facial analysis model Cazettes and her collaborators developed was still able to decode an animal’s decision variables even after M2 inactivation, though not as well, Stringer notes. “You want to look beyond just cortex when thinking about the origins of some of this neural activity,” she says. In her own lab, she and her team have documented facial-movement-linked activation across “basically every brain region” they have studied in mice.

Meanwhile, a paper from the International Brain Laboratory published earlier this month demonstrates how decision-making sparks activity throughout the mouse brain; both Mainen and Cazettes were part of the consortium that mapped brain-wide activation associated with a visual choice task.

Now the head of her own lab at Aix-Marseille University and the French National Centre for Scientific Research, Cazettes is expanding her research to studies that incorporate virtual reality. As facial movement analysis advances, she and her collaborators say they worry about how similar technology might be used for human surveillance. “Our most private thoughts could now become an open book,” she says.

Schölvinck says she plans to expand her research on macaques to human participants but believes such privacy concerns are years away. Still, she adds, “there’s a lot more from faces that you can read with those methods than people might have previously thought.”
