Contingency in infant-directed speech: Neurophysiological and psycho-social responses in pre-linguistic infants
We are looking at brain responses in 6- to 9-month-old babies during spontaneous conversation with a parent. Adults will also be tested as a control group.
Do all babies grow up to behave the same way, or do they quickly become different human beings under different cultures? We aim to answer this question by comparing babies across cultural backgrounds and labs around the world.
In partnership with Karitane and as part of our research into links between early language exposure and child language development, we are collecting infant-directed speech in groups of mothers with postnatal depression, mothers not clinically depressed but experiencing low mood, and mothers without depression.
An examination of mother-infant interaction, comparing two groups of mother-infant dyads (control and at-risk) to determine the impact of maternal depression and anxiety on the quantity and quality of infant-directed speech and on infants' early linguistic development.
In this project, we are validating a new tool, the Functional Listening Index (FLI), that will be used to measure listening skills in children who have normal hearing or who have hearing loss. The FLI can be used with children from birth until 6 years of age, and it will be an important screening and diagnostic tool.
This experiment examines infants' general responses to infant-directed speech and their gaze-following patterns across various language and cultural backgrounds.
Mummy, why does that lady talk funny? Word generalisation and adaptation to unfamiliar regional accents can reveal the path of early word learning.
The purpose of this project is to investigate infants' ability to learn and recognise words when they are spoken in a familiar accent (Australian English) and an unfamiliar one (Cockney English). Benefits of the research include increasing our understanding of the development of speech and language.
It is well known that parents modify the qualities of their speech when they speak to their young infants. However, there are large individual differences in the extent to which each parent does so, and these differences relate significantly to infants' language development. In this study, we assess the effects of a brief training program designed to maximise the extent to which parents modify the qualities of their speech in ways that support their baby's language development.
Seeds of language development: Development of hearing impaired infants’ speech perception and vocalisation over the first three years of life
This project investigates how infants with and without hearing loss develop early language skills from birth until 2 years of age. We are interested in finding out how hearing loss affects the development of the ability to perceive language sounds, and how these effects can be mitigated.
All humans speak differently, even in the same language, but we're very good at adapting to these differences! We are interested in examining how we do this depending on language background (monolingual or bilingual) and on the type of information we hear. We use infants' electrical brain signals to determine how sensitive they are to these changes, which tells us how they use their language representations to adapt to speech.
This study investigates how young infants perceive different speech registers using electroencephalography (EEG). During the experiment, the baby sits on a parent's lap in front of a monitor and listens to different sounds presented over loudspeakers.
Infant-directed speech (IDS), also known as ‘baby talk’, facilitates early language processing and word learning. However, IDS has many different properties, including exaggerated positive emotion, vowel hyper-articulation, higher pitch and greater pitch variability, slower tempo, and shorter sentences. Here, we study whether specific properties of IDS are more important for word learning than others.
This project investigates how visual speech information influences speech perception by examining if and how visual speech cues from a speaker's talking face may augment infants' and children's speech perception. It will also examine whether the visual speech benefit differs between those with normal hearing and those with hearing impairment.
This research project aims to understand how parents with young infants use digital technology to communicate when they are geographically separated from their infant. Families with an infant aged between 6 and 12 months will be invited to participate in this study at the MARCS BabyLab. Two brief interactions will be audio- and video-recorded for the researchers to analyse. Mothers will also complete a short interview following the interaction.
In this study, we look at what cues help babies tell the difference between their native language and another language. Specifically, we compare babies learning English, a language that uses only consonants and vowels to distinguish word meaning, with babies learning Thai, a language that also uses tones to distinguish word meaning.