Computers Can Tell if You're Bored

Computers are able to read a person's body language to tell whether they are bored or interested in what they see on the screen, according to a new study led by body-language expert Dr Harry Witchel, Discipline Leader in Physiology at Brighton and Sussex Medical School (BSMS).

The research shows that by measuring a person's movements as they use a computer, it is possible to judge their level of interest by monitoring whether they display the tiny movements that people constantly exhibit, known as non-instrumental movements.

If someone is absorbed in what they are watching or doing - what Dr Witchel calls 'rapt engagement' - there is a decrease in these involuntary movements.

Dr Witchel said: "Our study showed that when someone is really highly engaged in what they're doing, they suppress these tiny involuntary movements. It's the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle.

The discovery could have a significant impact on the development of artificial intelligence. Future applications could include the creation of online tutoring programmes that adapt to a person's level of interest, in order to re-engage them if they are showing signs of boredom. It could even help in the development of companion robots, which would be better able to estimate a person's state of mind.
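To make the adaptive-tutoring idea concrete, the sketch below shows, in Python, one way such a feedback loop might look: a rise in fidgeting above a personal baseline is treated as a boredom signal and triggers a re-engagement step. This is purely illustrative and not the study's method; the motion_score() placeholder, the baseline value, and the 1.5x threshold are all assumptions.

```python
# Illustrative sketch of an adaptive tutoring loop (not taken from the study).
# Assumption: some sensor supplies a rolling movement score per interval;
# a sustained rise above a learner-specific baseline is treated as boredom.
import random
import time


def motion_score() -> float:
    """Placeholder for a real movement measure (e.g. from video tracking)."""
    return random.uniform(0.0, 2.0)


def run_tutor(baseline: float, threshold: float = 1.5,
              interval_s: float = 5.0, checks: int = 10) -> None:
    """Poll the movement signal and adapt the lesson when fidgeting rises."""
    for _ in range(checks):
        score = motion_score()
        if score > threshold * baseline:
            print("Movement well above baseline - switching to an interactive task.")
        else:
            print("Learner appears engaged - continuing the current lesson.")
        time.sleep(interval_s)


if __name__ == "__main__":
    run_tutor(baseline=0.8)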

Also, for experienced designers such as movie directors or game makers, this technology could provide a complementary, moment-by-moment reading of whether the events on the screen are interesting. While viewers can be asked subjectively what they liked or disliked, a non-verbal technology would be able to detect emotions or mental states that people either forget or prefer not to mention.

"Being able to 'read' a person's interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process," Dr Witchel said. "Further ahead it could help us create more empathetic companion robots, which may sound very 'sci fi' but are becoming a realistic possibility within our lifetimes."

In the study, 27 participants faced a range of three-minute stimuli on a computer, from fascinating games to tedious readings from EU banking regulation, while using a handheld trackball to minimise instrumental movements, such as moving the mouse. Their movements were quantified over the three minutes using video motion tracking. In two comparable reading tasks, the more engaging reading resulted in a significant reduction (42%) of non-instrumental movement.
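As an illustration only, a simple frame-differencing approach in Python with OpenCV can quantify gross body movement from a video recording and compare two viewing conditions. This is a minimal sketch under the assumption that frame differencing is an acceptable rough proxy for the motion tracking used in the study; the file names and the comparison are hypothetical, not the authors' pipeline.

```python
# Minimal sketch: quantify movement from a video by frame differencing,
# then compare two viewing conditions. Assumptions: OpenCV is installed,
# and "engaging.mp4" / "boring.mp4" are hypothetical recordings of the
# same participant in two conditions.
import cv2
import numpy as np


def mean_motion(path: str) -> float:
    """Average per-frame pixel change, a crude index of body movement."""
    cap = cv2.VideoCapture(path)
    prev = None
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (21, 21), 0)  # suppress sensor noise
        if prev is not None:
            scores.append(float(np.mean(cv2.absdiff(gray, prev))))
        prev = gray
    cap.release()
    return float(np.mean(scores)) if scores else 0.0


if __name__ == "__main__":
    engaging = mean_motion("engaging.mp4")
    boring = mean_motion("boring.mp4")
    reduction = 100.0 * (1.0 - engaging / boring) if boring else 0.0
    print(f"Movement reduction in the engaging condition: {reduction:.0f}%")
```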

The study team also included two members of Dr Witchel's group, Carlos Santos and Dr James Ackah, media expert Carina Westling from the University of Sussex, and the clinical biomechanics group at Staffordshire University led by Professor Nachiappan Chockalingam.

BSMS is a partnership between the Universities of Sussex and Brighton together with NHS organisations throughout the south-east region.
