Algorithm ‘sees’ when people’s eyes meet

A new machine-learning tool detects eye contact during recorded face-to-face interactions as accurately as expert observers can. The tool could help researchers and clinicians measure eye contact efficiently and objectively.

Eye contact is key to social interactions, and a tendency to avoid it can be one of the earliest signs of autism.

Researchers and clinicians often measure a child’s use of eye contact by manually noting instances of it during recorded interactions with an adult — a method that is time-consuming and subjective. Computer models can detect eye contact in videos more quickly but less accurately.

The new computer model, described in December in Nature Communications, is the first to achieve an accuracy comparable to that of expert observers — a feat made possible by using ‘deep learning,’ a subset of machine learning.

The team first trained an algorithm on three public datasets to perform tasks related to recognizing eye contact, such as detecting the position of a person’s head and the direction of their gaze. This pretraining step, a strategy known as ‘transfer learning,’ improves the final algorithm’s accuracy, the researchers say.
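The idea behind transfer learning can be illustrated with a toy example: fit a model on a related auxiliary task (here, a stand-in for gaze estimation), then freeze what it learned and train only a small classifier on top for the target task (eye-contact detection). This is a minimal pure-Python sketch of the general strategy, not the paper’s deep network; all data and functions below are invented for illustration.

```python
import math
import random

random.seed(0)

def pretrain_gaze(samples, lr=0.1, epochs=200):
    """Step 1 (auxiliary task): fit w so that w*x approximates the
    'gaze angle' y, by stochastic gradient descent on squared error."""
    w = 0.0
    for _ in range(epochs):
        for x, y in samples:
            w -= lr * (w * x - y) * x
    return w

def finetune_head(samples, w_frozen, lr=0.5, epochs=500):
    """Step 2 (target task): freeze the pretrained weight and train only
    the bias of a logistic eye-contact classifier on top of it."""
    b = 0.0
    for _ in range(epochs):
        for x, label in samples:
            p = 1 / (1 + math.exp(-(w_frozen * x + b)))
            b -= lr * (p - label)
    return b

# Synthetic auxiliary data: 'gaze angle' is twice a head-pose feature.
aux = [(x / 10, 2 * x / 10) for x in range(-10, 11)]
w_pretrained = pretrain_gaze(aux)

# Synthetic eye-contact labels: positive when the feature is large.
ec = [(1.0, 1), (0.8, 1), (-1.0, 0), (-0.7, 0)]
b = finetune_head(ec, w_pretrained)
```

The payoff is that the target-task training only has to fit the small head, because the frozen weights already encode a related skill — the same reason the researchers pretrained on head pose and gaze before fine-tuning for eye contact.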

The researchers then fine-tuned the algorithm to detect eye contact specifically, in more than 4 million frames from recorded interactions — about twice as many frames as were used to train previous algorithms. The interactions involved 121 children — 66 of them with autism — aged 18 months to 5 years, and an adult, who wore a pair of glasses with an embedded camera to record the child’s face.

Intelligent algorithm

During each interaction, the child participated in two play-based assessments designed to elicit eye contact. In one test, for example, the examiner gives the child a wind-up toy only when the child makes eye contact.

A group of 10 trained raters watched videos of the interactions frame by frame and identified those in which the child appeared to be looking at the examiner’s eyes. The researchers trained the algorithm using 103 children’s videos, each scored by a single rater. They then tested it on videos of the remaining 18 children, analyzed by multiple raters. The team judged a rater’s or the algorithm’s call on a frame to be correct when it matched the majority of the raters.
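The majority-vote scheme described above can be sketched in a few lines. The function names and the example ratings here are hypothetical; only the judging rule comes from the article.

```python
from collections import Counter

def majority_label(ratings):
    """Ground truth for one frame: the label most raters assigned."""
    return Counter(ratings).most_common(1)[0][0]

def frame_correct(assessment, ratings):
    """A rater's or the algorithm's call counts as correct if it
    matches the majority of the human raters."""
    return assessment == majority_label(ratings)

# Hypothetical ratings for one frame from five raters (1 = eye contact).
ratings = [1, 1, 0, 1, 0]
print(frame_correct(1, ratings))  # True: the majority said eye contact
print(frame_correct(0, ratings))  # False
```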

Comparing the raters with the algorithm revealed that the algorithm is as accurate as the average rater. It also catches the vast majority of frames in which eye contact occurs.
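“Catching the vast majority of eye-contact frames” corresponds to the standard recall metric: of the frames the majority of raters marked as eye contact, what fraction did the model also flag? A minimal sketch, with made-up frame labels:

```python
def recall(predicted, truth):
    """Fraction of true eye-contact frames (truth == 1) that the
    model also flagged (predicted == 1)."""
    hits = sum(1 for p, t in zip(predicted, truth) if t == 1 and p == 1)
    total = sum(truth)
    return hits / total if total else 0.0

# Hypothetical per-frame labels: 1 = eye contact, 0 = no eye contact.
truth     = [1, 1, 1, 0, 0, 1]
predicted = [1, 1, 0, 0, 1, 1]
print(recall(predicted, truth))  # 0.75: 3 of the 4 eye-contact frames caught
```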

The team also used the algorithm to replicate findings from two of their previous studies, in which trained raters had assessed eye contact in videos of autistic and non-autistic children and adolescents. The researchers were able to reproduce all of their original findings, statistical tests show, which suggests the computer model is suitable for research purposes.

“It actually provides the same quality of evidence for a scientific hypothesis as you got when you did it manually,” says James Rehg, professor of interactive computing at the Georgia Institute of Technology in Atlanta, who led the research.

The algorithm could be used in a variety of settings, such as measuring changes in eye contact in response to therapy, the researchers say. Code for the algorithm is available online.

The post Algorithm ‘sees’ when people’s eyes meet appeared first on Spectrum | Autism Research News.
