Are today’s children, who grew up with mobile technology from birth, worse at reading emotions and picking up cues from people’s faces than children who didn’t grow up with tablets and smartphones? A new UCLA psychology study suggests today’s kids are all right.
Infancy and early childhood are critical developmental phases during which children learn to interpret important non-verbal cues such as facial expressions, tone of voice and gestures. Traditionally, this happens through direct face-to-face communication. But with the ubiquitous use of tablets and other devices today — among toddlers, as well as their caregivers — the psychologists wanted to know: Have younger children missed the opportunity to understand these cues?
The study tested more than 50 sixth graders in 2017 and more than 50 sixth graders in 2012, both male and female and from the same Southern California public school, on their ability to correctly identify emotions in photographs and videos. Most children in the 2012 sixth-grade class were born in 2001; the first iPhone was released in 2007 and the first iPad in 2010, when the sixth graders in the 2017 class were infants and toddlers.
The psychologists found that the 2017 sixth graders scored 40% higher than the 2012 class at correctly identifying emotions in photographs and made significantly fewer errors. The 2017 students were also slightly better at identifying the emotions in a series of videos, but the researchers said that difference was not statistically significant. The study did not examine face-to-face communication.
Read more at University of California - Los Angeles
Image: Sixth graders in the study looked at photos, including this one, and were asked to identify whether the person was happy, sad, angry or fearful. (Credit: Stephen Nowicki)