The Happy Talk Study

“Happy Talk” is the first study in the InfantLab’s new eyetracking lab. We are looking at what makes Infant Directed Speech so attractive to babies.

What is Infant Directed Speech?

No matter the language or region of the world, people tend to talk to babies in a slow, high-pitched tone, often with a smile on their faces. It is second nature for us to change the way we speak and the way we pronounce our words when talking to babies rather than to adults. Just go ahead and try talking to your baby the way you normally would to an adult; it’s hard, isn’t it? This slowed, high-pitched style of speech is referred to as infant directed speech (IDS), as opposed to regular adult directed speech (ADS).

Past Research

A past study has shown that mothers make this consistent change to their voice when talking to their baby, no matter the language being spoken. This shows that IDS is consistent across cultures and therefore must hold some significant meaning for babies. The researchers believe the change in tone is a ‘cue mothers implicitly use to support babies’ language learning’ (Source). Past research has also shown that both newborns and one-month-olds preferred infant directed speech to adult directed speech, most likely because of the exaggerated prosody — the stress and intonation in speech — of IDS (Source). That study looked at the preferences of babies under one month old, to see whether they favoured a character speaking in IDS or one speaking in ADS. The results showed that newborn babies’ preference for the characteristics of IDS is present from birth, although the researchers also consider the possibility that parents talking to their babies before birth played a role in these results.

“Language but also emotion”

Another study looked at the importance of infant directed speech in babies’ language acquisition, the learning of new segments of language. This study specifically showed that the pitch characteristics of infant directed speech affect how well babies can tell the difference between vowel sounds. The researchers concluded that the exaggerated pitch range of IDS helps with vowel discrimination, whereas its high overall pitch does not, instead aiding emotional communication and gaining the baby’s attention. This shows how important IDS can be both for babies’ learning of language and for their understanding of emotion (Source).

 

Our Current Study

A mother and baby taking part in the Happy Talk study

The baby will sit on their parent’s lap while two cartoons appear on a screen, one speaking in IDS and one in ADS. Throughout the study we will track the baby’s eye movements to see which cartoon they favour, the IDS one or the ADS one. To accurately discern which cartoon the baby truly likes more, we are using eye-tracking technology to measure which one the baby looks at longest. We will also record the session to see the baby’s overall reaction to both types of speech and judge which they tend to prefer based on their body language and facial expressions.

Here is a video from when the Evening Standard recently came to our lab to discuss this study and other research: https://www.standard.co.uk/news/health/the-latest-brainwave-octopus-hat-to-monitor-children-s-development-a4096381.html

Information to sign up

We are currently looking for more wonderful babies to participate in this study here at Goldsmiths InfantLab. If you and your baby are interested in participating, please contact us at infantlab@gold.ac.uk, call 0207 717 2983, or register on our website https://sites.gold.ac.uk/infantlab/take-part/

 

Article by Megan Loftus, 17 April 2019

 

Natasa goes to Japan!

In mid-April 2019 Natasa Ganea, a PhD student in Goldsmiths InfantLab, will be going to Chuo University in Tokyo, Japan for 6 weeks.  Thanks to funding from the Economic and Social Research Council, she will have the chance to run a rare cross-cultural study on what babies know about gender.

About the study

Natasa wants to find out how babies perceive the gender of faces and voices. Infants will watch videos showing a woman and a man speaking in synchrony, side-by-side. As the infants watch the videos, they will hear a single female or male voice in the background. Natasa wants to know if they will match the gender of the voice to the correct face. She also wants to know if this is harder for faces of a different culture.

A Japanese woman and man with a speech bubble

Will babies match the gender of the voice they hear to one of two faces on the screen?

Some infants will watch pairs of Caucasian-British speakers reading in English. Other infants will watch pairs of Asian-Japanese speakers reading in Japanese. Natasa predicts the Japanese babies will find the task easier with faces from their own native culture. When Natasa returns to the UK she will run the same study with British babies, and she expects the pattern to reverse.

This is the final study of Natasa’s PhD which is all about how babies combine information from different senses to understand the world.

This is also the second collaboration between Chuo University and Goldsmiths InfantLab. Between 2016 and 2018, Jiale Yang, a researcher from Chuo University, visited Goldsmiths InfantLab. Jiale and other members of the Yamaguchi Lab will help Natasa conduct the study.

By Catherine Zhao, 9 April 2019

The Baby Talk Study

All over the world, when people talk to babies they use a very similar tone of voice. We raise our pitch and change our intonation in ways that sound happier and more excited. Researchers have found that infant directed speech (IDS) has a distinctive change in vocal timbre that is similar across many languages. This appears to help babies learn language because it makes it easier to separate individual words and makes key vowel sounds more distinctive. Researchers agree that emotional content and intonation both matter when speaking to babies, but the relative importance of these elements is not known.

The aim of the present study is to learn more about baby talk. We are working with JJ Aucouturier and his team from IRCAM, Paris, who are specialists in emotion in music and speech signals in adults. They have developed audio filters that can manipulate the ‘smileyness’ and other emotional aspects of a speech signal. We will use their software to manipulate recorded speech and see which version babies prefer.

Sounding Shapes

Many objects in the environment produce sounds as they move. This has been found to help infants track the trajectory of objects even when the objects briefly disappear from sight, as when an object moves behind an occluder. For example, when infants both see and hear a ball moving sideways behind a box, they anticipate where the ball will reappear more often than when the ball is moving but the sound appears to come from the static box. Whether infants encode sound-producing objects better, or simply track the location of such objects more accurately, is the question we are trying to answer in this study.

To do this, we are showing infants different geometric shapes that move sideways behind an occluder, accompanied by different sounds, and recording how long infants spend watching the animations. We are currently inviting 4- to 5-month-old infants to take part in this study.

What Material Is It Made From?

As adults, we can recognise both what an object is and what material it is made of. In order to identify the material category to which an object belongs (e.g. metal, ceramic, fur), we have to obtain information about the physical properties of the object via multiple sensory modalities (e.g. vision, hearing, touch) and integrate this information in the brain. There is a large body of literature describing the development of infants’ ability to recognise objects (e.g. faces), but far less is known about the development of material recognition.

With this study, we would like to find out how infants process information about material via vision and touch. We will show your child several small metal cylinders, some of which have been covered with faux fur (see picture), and allow him/her to either manually explore or look at the cylinders. How your child explores and watches the cylinders will be videotaped. These recordings will allow us to investigate the extent to which babies distinguish different kinds of materials. We are currently inviting 4- to 8-month-old infants to participate in this project.

Tickling Lights

If we feel a touch on one of our hands, it draws our attention to that hand. When we record adults’ brain activity while flashing a light on either the hand they are paying attention to or the hand they are not, we see a difference in how the brain processes that flash of light: paying attention to something produces a much larger brain response than when our attention is elsewhere. Little is known about this process in infants. By recording your child’s brain activity (using a special cap), we are able to focus on the parts of the brain that process vision and touch and explore how infants process this kind of information.

In our study, we want to find out how young infants’ brains respond to gentle vibrations (which feel like slight tickles) delivered to the backs of their hands via scratch mittens, signalling the location of a flash of light presented moments later. We are currently inviting 7- and 10-month-old infants to participate in this project.

Where Is It Going?

As adults, we are able to combine information from vision and audition to discriminate the trajectories of moving objects and to decide which ones it is important to direct our attention towards. However, we do not know whether these abilities are already present in the first months of life, nor how they develop during infancy. With this study, we would like to find out how infants process information about the trajectories of moving stimuli in the environment. To address this, we use EEG to measure infants’ spontaneous brain activity while they attend to motion cues conveyed by vision alone, by audition alone, or by the two senses combined.

We are currently inviting 5- and 8-month-old infants to participate in this project.

Music and Emotion

For adults, certain types of music are almost always experienced as uplifting or even joyful, while others are sad or conjure up a fearful situation. Research shows that adults from many cultures recognise these emotions in music and can match them with equivalent facial expressions. This suggests these might be universal aspects of emotion expressed through music, and investigating this with babies helps us find out whether that is the case. In fact, previous studies have shown that babies can match happy and sad music with the corresponding facial expressions. Our current study extends this to look at fearful music and expressions as well. The research will give us insights into which aspects of music babies understand and will allow us to learn more about musical development in general.

You will be asked to complete some background questions, and then your baby will listen to several different musical clips, each presented with two faces. To measure your baby’s responses, we will film their reactions. You will be with your baby at all times.