Computer Trick Shows Pitch Is Just Perception


If researcher Elizabeth Petitti played two musical notes from her laptop, some people would hear the notes rise in pitch, while others would hear them fall. Why the difference?

The answer may improve our understanding of how our auditory system develops and may help speech-language pathologists who work with people who have hearing impairment.

Petitti says the answer comes down to the way our brains perceive two components that make up sound: fundamental frequency and harmonics.

A note’s fundamental frequency is the primary element of sound from which our brains derive pitch—the highness or lowness of a note. Harmonics give a note its timbre, the quality that makes instruments sound distinct from one another.

Many sounds in the world are made up of these components, whether you strike a key on a keyboard, play a note on a clarinet, or say a letter, says Petitti, who graduated from Boston University’s Sargent College of Health & Rehabilitation Sciences with a master’s in speech-language pathology.

Our brains expect the fundamental and the harmonics to be present in any given note. But when some of this information drops out, “the way you perceive the note can change in surprising ways,” says Petitti’s mentor, Tyler Perrachione, a professor at Sargent and director of the Communication Neuroscience Research Laboratory.

‘Pitch Exists Only in Our Minds’

Petitti explains that when she removes the fundamental from a tone (using signal processing software) and then plays that note, the listener’s brain automatically supplies the pitch. People’s brains deliver this information in one of two ways: they either fill in the missing fundamental frequency, much as the brain compensates for the eye’s blind spot, or they determine the pitch from the harmonics.
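As an illustration of that technique, here is a minimal sketch in Python with NumPy, assuming simple sine-wave synthesis rather than whatever stimulus software the lab actually used; the function names and parameter values here are illustrative assumptions, not the study’s:

```python
# Sketch of the "missing fundamental" effect (not the study's actual stimuli).
import wave
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def complex_tone(f0, n_harmonics=6, duration=0.5, include_fundamental=True):
    """Sum of sine waves at integer multiples of f0 (a harmonic complex)."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    start = 1 if include_fundamental else 2  # optionally skip the fundamental
    tone = sum(np.sin(2 * np.pi * f0 * k * t) for k in range(start, n_harmonics + 1))
    return tone / n_harmonics  # keep amplitude within [-1, 1]

def write_wav(filename, signal):
    """Save a mono float signal as a 16-bit WAV file."""
    samples = (signal * 32767).astype(np.int16)
    with wave.open(filename, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(samples.tobytes())

# A 200 Hz note with and without its fundamental. The second file has no
# energy at 200 Hz, yet most listeners still report a 200 Hz pitch: the
# brain infers it from the spacing of the remaining harmonics.
write_wav("full_tone.wav", complex_tone(200))
write_wav("missing_fundamental.wav", complex_tone(200, include_fundamental=False))
```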

Here’s where it gets interesting: When two different tones that have been stripped of their fundamentals are played in succession, some listeners hear their pitch rising, and some hear it falling. Who’s right?
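One way a pair like this can be constructed, again a sketch under assumed parameter values rather than the study’s actual stimuli, is to make the implied fundamental move in one direction while the audible components move in the other:

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def partials(f0, harmonic_numbers, duration=0.5):
    """Sum of sines at the given harmonic numbers of f0 (fundamental omitted)."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    tone = sum(np.sin(2 * np.pi * f0 * k * t) for k in harmonic_numbers)
    return tone / len(harmonic_numbers)  # keep amplitude within [-1, 1]

# Note A: harmonics 2-4 of 200 Hz -> energy at 400, 600, 800 Hz.
# Note B: harmonics 3-6 of 150 Hz -> energy at 450, 600, 750, 900 Hz.
# The implied fundamental falls (200 -> 150 Hz), but the lowest audible
# component rises (400 -> 450 Hz). A listener who supplies the missing
# fundamental hears the pair descend; one who tracks the harmonics
# themselves hears it ascend.
note_a = partials(200, [2, 3, 4])
note_b = partials(150, [3, 4, 5, 6])
silence = np.zeros(SAMPLE_RATE // 4)  # a quarter-second gap between notes
pair = np.concatenate([note_a, silence, note_b])
# The pair can be saved with write_wav() from the previous sketch.
```

Whether a listener reports a pair like this as rising or falling reveals which cue their auditory system weights more heavily, which is exactly the ambiguity such experiments exploit.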

“There’s no right answer,” Perrachione says. “Pitch only exists in our minds. It’s a perceptual quality.” So, how exactly do we determine pitch? It turns out the language we speak plays a role.

Does Native Language Matter?

Petitti and Perrachione theorized that individuals who grew up speaking a tone language like Mandarin would perceive pitch differently than those who grew up speaking a non-tone language like English. In Mandarin, for example, a word often has several meanings, depending on how the speaker employs pitch; mā (with a level tone) means “mother,” while mǎ (which drops, then rises in tone) means “horse.”

To test this theory, Petitti invited 40 native English speakers and 40 native tone-language speakers to participate in a study, which she and Perrachione presented at the International Congress of Phonetic Sciences in August 2015. Each participant listened to 72 pairs of tones stripped of their fundamental frequencies and then indicated whether the tones were moving up or down.

Petitti and Perrachione found that language does change the way we hear. Individuals who grow up speaking English are more attuned to a note’s harmonics, while tone-language speakers are more attuned to its fundamental. So, when a note is stripped of its fundamental, tone-language speakers are more likely to derive pitch by supplying the missing frequency than by listening to the harmonics still present in the note.

Musical Training and Learning Language

These results led Petitti and Perrachione to wonder whether the difference in pitch perception is grounded in our earliest language acquisition, or whether other experiences can also affect how our brains process sound. For instance, would musicians, who also rely on pitch, perceive sound the same way as tone-language speakers?

When they put the question to the test, Petitti and Perrachione found that neither the age at which a musician began studying nor the number of years of practice affected the musician’s perception of pitch. To Petitti, this suggests that while you may begin learning an instrument as early as age three, “you start language learning from birth,” she says. “So your auditory system is influenced by the language you are exposed to from day one.”

It’s not just theoretical. “Big picture: We are interested in how brains change with experience and how our experiences predispose us to certain auditory skills,” Perrachione says. This understanding could “help us better understand the opposite, when things don’t work quite right,” such as when a person has a disorder like amusia (tone deafness).

Petitti underscores the study’s potential clinical impact; in her career as a speech-language pathologist, she intends to work with clients who have hearing impairments, which will involve teaching them to perceive and use pitch. This ability is “crucial when you’re teaching how to ask a question, and how to use pitch to signal the difference between words,” she says—all skills we typically begin to develop early and unconsciously.

This article was originally published by Boston University. Republished via Futurity.org under Creative Commons License 4.0.