Virtual mouse helps scientists track behavior
An animated mouse reenacts common behavioral experiments and can be used to train algorithms that automatically track lab animals’ movements. The approach could help researchers analyze mouse behavior more efficiently.

Researchers typically use video cameras to capture the behavior and movements of mice that model certain autism traits. They can then use machine-learning algorithms to automatically label and track specific body parts, such as a mouse’s fingers or spine.

Training the algorithms can be laborious, however. To train a widely used tracking tool called DeepLabCut, for example, scientists need to manually label specific points on an animal in about 100 to 200 still frames. Pooling data from multiple videos presents additional challenges and may make the training process even longer.

“If the lighting [is] different, or the camera angle, it might throw off some of the machine-learning tracking systems,” says Timothy Murphy, professor of psychiatry at the University of British Columbia in Vancouver, Canada, who led the new work. “We wanted to have a more generic scene.”

To make it easier to combine data from multiple videos, Murphy and his colleagues created an animated model of a mouse and used it to simulate videos of real mice. The animated videos could speed up the process of training algorithms, Murphy says, because researchers only have to label features of interest on the virtual mouse once.

The virtual mouse, replete with skin, fur and whiskers, reflects computerized tomography scans of three female mice. The team used artificial intelligence tools to make the animated mouse appear realistic and added slight variations to the videos — tweaking a mouse’s movements or altering the lighting, for instance — to diversify the training data.
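The kind of variation the team describes is a form of domain randomization. The sketch below is illustrative only, not the team's actual rendering pipeline: the specific transforms (brightness and contrast jitter, a small pixel shift standing in for camera-angle changes) and their parameter ranges are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def randomize_frame(frame: np.ndarray) -> np.ndarray:
    """Apply random lighting and position jitter to one rendered frame
    (H x W x 3 array with float values in [0, 1])."""
    brightness = rng.uniform(-0.1, 0.1)   # global lighting shift
    contrast = rng.uniform(0.8, 1.2)      # lighting/contrast jitter
    out = np.clip((frame - 0.5) * contrast + 0.5 + brightness, 0.0, 1.0)
    # A small random translation stands in for slight camera-angle changes.
    dy, dx = rng.integers(-5, 6, size=2)
    return np.roll(out, shift=(dy, dx), axis=(0, 1))

frame = rng.uniform(size=(480, 640, 3))
augmented = randomize_frame(frame)
```

Each call produces a slightly different version of the same rendered frame, so one labeled animation can yield many diverse training images.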

Training simulation:

To test the approach, the team used a synthetic video of the animated mouse running on a wheel. They trained a DeepLabCut algorithm to track 28 markers along the animated animal's spine and left limbs, then used the algorithm to analyze videos of real mice and assessed its performance. To compare the synthetically trained algorithm with more traditional methods, the researchers also hand-labeled frames from real footage of a mouse.

The algorithm trained on synthetic videos performs about as well as manual approaches: its accuracy is comparable to that of an algorithm trained on 200 hand-labeled frames, the researchers reported in April in Nature Methods. The synthetically trained algorithm's errors are also small: marker positions are, on average, 6.7 pixels from their hand-labeled locations.
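The 6.7-pixel figure corresponds to a mean Euclidean distance between predicted and hand-labeled marker positions. A minimal version of that metric, using made-up coordinates rather than the paper's data:

```python
import numpy as np

def mean_pixel_error(predicted: np.ndarray, labeled: np.ndarray) -> float:
    """Mean Euclidean distance, in pixels, between predicted and
    hand-labeled keypoints. Both arrays have shape (n_markers, 2)."""
    return float(np.linalg.norm(predicted - labeled, axis=1).mean())

# Two hypothetical markers: one off by a (3, 4) pixel offset, one exact.
predicted = np.array([[100.0, 200.0], [150.0, 250.0]])
labeled = np.array([[103.0, 204.0], [150.0, 250.0]])
print(mean_pixel_error(predicted, labeled))  # → 2.5
```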

The researchers used the artificial videos to track the position of specific body parts in 3D. Traditional approaches typically require several cameras set up at different angles to do the same.

With the virtual mouse, the researchers could estimate the 3D positions of body parts using only one camera view. And because the virtual mouse model inherently captures 3D information, researchers can readily translate the 2D position of a body marker to 3D.
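One way to read that: with a calibrated pinhole camera and a depth value taken from the 3D mouse model, a 2D marker can be lifted into 3D from a single view. This is a generic back-projection sketch, not the paper's method, and the intrinsic matrix and depth below are made-up values for illustration.

```python
import numpy as np

def backproject(u: float, v: float, depth: float, K: np.ndarray) -> np.ndarray:
    """Lift a 2D pixel (u, v) to a 3D point in the camera frame, given the
    point's depth (here assumed to come from the virtual mouse model) and
    the camera intrinsic matrix K."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics: 600 px focal length, principal point at (320, 240).
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
point3d = backproject(380.0, 300.0, 0.5, K)  # depth in meters, from the model
```

Without a known depth, a single camera only constrains a marker to a ray; the 3D model supplies the missing constraint that multi-camera setups normally provide.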

The virtual mouse could help researchers train a variety of algorithms that analyze behavior, the researchers say. Creating an animated video takes time, but the approach may be particularly useful for researchers who want to track numerous body parts, or when several groups are using the same experimental setup.

Murphy and his colleagues say they plan to use the approach to look for subtle behavioral differences in mice that recapitulate autism traits.

The post Virtual mouse helps scientists track behavior appeared first on Spectrum | Autism Research News.

Published 28.05.2021