Have you ever wondered why we say “bought” rather than “buyed”? Or why some of us drink “soda” while others drink “pop”? Or why it is easy to pick up a new language as a child, but hard as an adult?
Understanding language is a complex endeavor that represents an academic gold mine for researchers across many disciplines. The greatest of all social tools, language lets people share knowledge and feelings, conduct business, make plans, tell stories, give instructions, and transmit culture from generation to generation.
But beyond facilitating interpersonal interactions, language provides a window to the inner workings of the human mind. As a field of study, it connects the humanities, the social sciences, and the natural sciences, incorporating areas as diverse as psychology, neuroscience, anthropology, philosophy, education, computer science, and artificial intelligence.
“How we learn, process, and use language, and how we change language over time—these are really fundamental aspects of being human, so a lot of different experts are asking these questions,” says professor of psychology John Trueswell. “Having multiple disciplines collaborating and using different methods to find the answers is what makes the study of language so interesting.”
Using a variety of observational and experimental approaches, Penn Arts and Sciences faculty are delving deeply into the mechanisms underlying language acquisition, interpretation, and production. Their work reveals the intricate ways in which people shape language—and in which language shapes us.
Gems versus junk
Children generally make acquiring language look effortless. As infants become toddlers, they simply begin to speak, without formal instruction.
Trueswell knows this is harder than it appears. “It would seem that children could learn the meanings of their first words simply by looking at the world around them—shoes are around when they hear the word ‘shoe’—but the problem is that the world gives them both too much and too little information to figure out what most words mean,” he says.
The “too much” comes into play with concrete nouns like “ball” or “book,” which could refer to any item a baby can see—or even one they can’t. Trueswell and longtime collaborator Lila Gleitman, emerita professor of psychology, have demonstrated how difficult it is to match words with objects in the absence of verbal clues, even for adults. In their experiment, participants watched muted videos of parent-child interactions, then heard a “beep” at the moment a parent said a concrete noun. When asked to guess the word, they often had no idea.
“Identifying a particular word’s referent, especially when you don’t know any words to begin with, is not a simple task. Moments when physical cues make it clear what a parent is referring to are the most influential word learning moments for children, but these referential ‘gems’ are rare, and the rest of the time, it’s junk,” says Trueswell. He encourages parents to speak about the “here and now” to help their children learn more words, since a child’s vocabulary size at school entry is a predictor of future academic success.
The world offers infants “too little” information, Trueswell says, when it comes to verbs like “give” and “get” and abstract nouns like “idea” or “knowledge.” In these cases, they must infer meanings based on their existing vocabulary and “this is likely why these kinds of words are learned later.”
Trueswell also devotes much of his time to analyzing children’s eye movements as they listen to spoken descriptions of their surroundings. “When you can see where someone is looking in the world as they’re hearing speech about that world, you get a moment-by-moment record of what they think an utterance means,” he says. “This gives us insight not only into what they know about language but also how they process it in real time.”
The sounds that matter
Daniel Swingley, professor of psychology, directs Penn’s Infant Language Center, where, like Trueswell, he uses observational studies to assess how children acquire language, including the words they comprehend before they are able to speak.
“Babies are thrown into an environment where everyone is using this very complicated signal to convey information, and they have to figure out that there are words, as well as which sounds are meaningful for telling words apart. A baby learning English has to be able to distinguish similar sounds like bit, bait, bat, and bet—but babies in other environments have to learn other sounds,” says Swingley, who aims to understand how infants come to grasp these features of their language.
Swingley uses “language-guided looking” to evaluate word recognition: Infants and toddlers in his lab watch a screen showing images of two different objects, and as researchers begin talking about one of the objects, children will rapidly look to the corresponding image if they already know the word for it. Swingley says babies as young as six months have shown evidence that they understand the meaning of certain words.
Infant Language Center researchers also test children’s knowledge by adjusting the way they pronounce various terms. They have found that changing “dog” to “tog,” for example, will lead many children—including those who do not yet speak—to reject an image of a dog, indicating that they realize pronunciation matters.
“This is interesting because even when we’re talking to infants, the language adults use is not optimal; we’re lazy, we don’t over-articulate, we make elisions and leave words out. But amazingly, babies can somehow make sense of these very messy signals,” Swingley says.
By analyzing children’s “language environment”—the way their family engages them in conversation—Swingley also examines how specific features of family talk lead to infants’ learning of particular words and sounds.
“By studying how children turn experience into knowledge, we’re building a picture of how language development works to help us understand why it sometimes seems to go awry. For conditions where language doesn’t operate in the expected way, it’s useful to know about the normal course of development,” he says.
Choosing and using our words
Even if they speak the same language, people don’t always understand each other. Delphine Dahan, associate professor of psychology, wants to know why.
“We don’t realize how flexible language is,” Dahan says. Beyond the endless variety of tones, volumes, pronunciations, and emphases that can change the meaning of a statement, “sometimes it’s the terms you choose that keep you from being understood—the terms you think are most effective to describe something might not be effective for another person at all.”
Dahan’s current studies focus on verbal interactions. Her experiments involve two participants who hold identical sets of cards that picture various objects; one person describes an object from one of the cards, and the other identifies the card in question. Depending on whether an accurate match was made, Dahan analyzes why a speaker’s communication was successful or unsuccessful.
The game seems simple, but Dahan observes errors even with the most mundane objects. If several cards feature sneakers, she notes, a participant might describe one using its color, its brand, the sport it’s commonly worn for, or some other variable that may or may not resonate with the listener, even if the speaker believes it will. Her goal is to highlight the complexity behind choosing an expression, which relies heavily on perspective-taking. For example, she explains, you probably wouldn’t describe a pair of shoes in the same way to a child as you would to a teenager, because you assume different things about them.
“I want to see how people use their knowledge both to communicate what they mean and to understand what others mean,” says Dahan, who has found that demographics—particularly education level—factor into participants’ performances and is interested in how communication skills and styles can create inequalities among different groups.
“There’s an assumption that what we are studying is so basic that results will be the same across the entire population, but that’s not true. I want to understand why that is and how it matters in people’s everyday lives, where they are in society, and if we should be doing something about it,” she says. “People’s ability to express themselves through language and to understand what others are expressing is critical, because verbal communication is the base of everything in our society.”
Artificial languages: the fruit flies of linguistics?
Studying the evolution of language is tricky: Since language change happens slowly, unfolding generation after generation, laboratory research is not an option.
Or is it?
Gareth Roberts, assistant professor of linguistics, examines how interpersonal interactions influence language change by doing experiments in which people communicate using made-up languages.
“In biology, fruit flies are used to study evolutionary processes because they reproduce very rapidly, so you can watch evolution happening. We do the same thing with these miniature artificial languages. They are very small and they are new to our participants, so they mutate faster than language does in the real world,” Roberts says.
Roberts describes language change as a form of “cultural evolution” because it involves socially rather than biologically transmitted behavior, and he has recently examined stereotyping’s influence on language. In an experiment he conducted with Betsy Sneller, participants played a computer game in which they acted as different alien species that could fight each other, with one species designed to look and sound tougher than the other. In that scenario, the “weaker” aliens borrowed language features from the tougher aliens they encountered—but when the experiment was repeated with fighting removed from the game, eliminating the social importance of toughness, that effect disappeared.
These results suggest that people tend to adopt elements of another group’s language when the stereotypes attached to that group appeal to them. Roberts uses studies like these to infer how real-world language evolves over time, as pronunciations shift and new words emerge.
“In my work, I get to do something that didn’t seem possible to most people until recently, and still doesn’t seem possible to some,” he says. “I create a small chunk of language and get to see it change right there in the lab, watching how it reacts to little prods and pulls to make steps in understanding why language changes.”
How people “tawk” in Philadelphia
Philadelphians are famous for their distinct regional dialect. They eat “wooder” ice, drink “cawfee,” and go “daown” the shore. But research shows that many of these well-known variations are diminishing, while new ones are stealthily popping up in their place.
Meredith Tamminga, assistant professor of linguistics, is studying the drivers of these changes to “Philadelphia English.”
“One change nobody notices but that there is strong evidence for is that a long ‘a’ like in ‘plate’ is becoming more like ‘ee,’ so the word sounds like ‘pleet.’ I can’t tell you why this is happening, but we’re interested in finding out,” Tamminga says. She is building on the work of her mentor, William Labov, the retired John H. and Margaret B. Fassitt Professor of Linguistics, who spent half a century analyzing the evolution of Philadelphia accents, particularly in relation to socioeconomic status.
In her current National Science Foundation-funded project, Tamminga is looking at language use within—rather than across—specific demographics to assess what characteristics cause individuals to modify their speech. Starting with young white women, researchers in her lab are observing friends in casual conversation to see if they use the accent variations already known to be emerging in Philadelphia. They then separate participants and evaluate their phonetic flexibility, or tendency to imitate the speech of others while talking to them. The resulting data will reveal whether there are connections between people’s phonetic flexibility and the likelihood that they will adopt and promote language change.
“We’re building a bridge between data from sociolinguistics, which puts a lot of emphasis on natural, conversational speech, and psycholinguistics, which uses controlled, laboratory-based experiments to study how we mentally process language—how we hear sounds and recognize words,” she says.
Tamminga believes her studies can help eliminate the stigma associated with language change, which is often viewed negatively. “People who push language change forward by using certain dialects are seen as lazy or uneducated, but the truth is, all languages are in flux and always have been. This is a natural, inevitable process and not a form of degradation,” she says.
Beyond the tipping point
A professor of linguistics and computer science, Charles Yang sees the two seemingly disparate fields as inextricably linked. “Language is a machine, and its core engine is in place by the time we are four or five years old. Whatever makes us able to learn language, then, has to be a simple—almost mechanical—system,” he says.
Yang has spent much of his career analyzing exceptions to linguistic rules, such as irregular verbs like “think” and “go” that do not form their past tense with the regular –ed ending. He ultimately developed a mathematical equation that calculates when children will find it worthwhile to master a rule and memorize its exceptions, as opposed to abandoning the rule altogether because it has so many exceptions that it loses any predictive power.
Recently, Yang began applying his equation, which he named the Tolerance Principle, to the process of counting. Empirical studies dating back to the 1980s have shown that once an English-speaking child can count to a “tipping point” of 73, he or she can continue counting indefinitely. No one understood why—until Yang explained it. “Counting is a process of learning rules of language and their exceptions. Once you figure them out, you can count forever,” he says.
There are 17 exceptions to the rules of counting: The words for numbers one through 10 are arbitrary and must be memorized, and the words for 11, 12, 13, 15, 20, 30, and 50 diverge from any expected pattern. “You don’t say 11 as ‘one-teen,’” Yang notes.
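Yang’s published formulation of the Tolerance Principle holds that a rule generalizing over N items stays productive only if its exceptions number no more than N/ln N. A minimal sketch in Python (the function names here are illustrative, not from Yang’s work) shows how the tipping point of 73 lines up with the 17 exceptions listed above:

```python
import math

def tolerance_threshold(n: int) -> float:
    """Maximum number of exceptions a rule over n items can tolerate (N / ln N)."""
    return n / math.log(n)

def rule_is_productive(n_items: int, n_exceptions: int) -> bool:
    """A rule survives if its exceptions fall at or below the threshold."""
    return n_exceptions <= tolerance_threshold(n_items)

# English counting up to 73 involves 17 exception words
# (one through ten, plus eleven, twelve, thirteen, fifteen, twenty, thirty, fifty).
print(tolerance_threshold(73))        # about 17.01
print(rule_is_productive(73, 17))     # True: 17 exceptions are just tolerated
print(rule_is_productive(72, 17))     # False: 72 / ln 72 is about 16.84
```

Under this formulation, 73 is the smallest count for which 17 exceptions fall within the threshold, which is consistent with the tipping point the empirical studies report.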
Yang is applying his research to other languages, each of which will have its own tipping point. Determining what that is has important educational implications, he says.
“Basic arithmetic skills tend to be predicted by how well children can count,” he says. “People should talk to children—not to teach them math, but to teach them language. A younger child who knows the rules of counting will be better than an older child who doesn’t when it comes to understanding that if you add 1 to 36, you get 37.”
Experts agree that human language is unique—not because it enables us to communicate, but because the combination of vocabulary and grammar permits us to communicate infinite ideas. Many animals can make sounds to warn their peers of looming danger or alert them to a food source, but none can convey thoughts like “That was an amazing nap” or “I really wish this rain would stop.”
Research duo Robert Seyfarth and Dorothy Cheney, emeritus professors in the departments of psychology and biology, respectively, have spent their careers analyzing communication in nonhuman primates and have identified parallels that might indicate how human language evolved from animal signaling.
“Animal vocalizations can mean very specific things and be associated with very specific events. The alarm calls of monkeys are a classic example. Also, animals can recognize the identity of the animal who is signaling,” says Seyfarth, whose extensive studies of baboons have consistently demonstrated that they comprehend strings of sounds—they respond differently, for example, to a series of vocalizations depending on whether the calls indicate that A is threatening B or B is threatening A. “It’s as if the baboon thinks in terms of a sentence with a subject, verb, and object. This kind of cognition is likely widespread in animals, certainly in socially living primates.”
Seyfarth believes language evolved because of humans’ needs to navigate complex social interactions and that human infants infer word meanings the same way nonhuman primates infer the significance of vocalizations: through context.
“Both creatures have to figure it out for themselves by integrating the sound they heard with what’s going on around them,” he says. “This suggests that some of the learning mechanisms we see in human infants who do not yet say words are similar to the ones we see in monkeys.”
Thought without language
Rather than studying language itself, Elizabeth Brannon, professor of psychology and the Edmund J. and Louise W. Kahn Term Chair in the Natural Sciences, examines the role language plays in human cognition by studying creatures that don’t have language.
“Studying thought without language is an avenue toward understanding how language influences thought and what kind of thought is unique to humans,” says Brannon, who examines whether animals and human infants can comprehend and represent number nonverbally.
To find out how monkeys think about numbers, she and her team train rhesus macaques to use touch screens so they can respond to pictures for food or juice rewards. Monkeys learn by trial and error to choose the numerically larger of two arrays, even when the numerically larger array has smaller items. They also test the monkeys’ basic math skills, asking them to choose an array that matches the sum of two other arrays or that indicates what is left if a subset of dots is removed. After extensive training, Brannon gives the monkeys numerically novel problems to make sure they are truly paying attention to number rather than simply learning to choose the response that results in reward.
“The monkeys are very good at these numerical tasks, and in all cases their behavior follows Weber’s Law—their ability to discriminate arrays is dependent on the ratio between the values rather than their absolute difference,” Brannon says. This means that arrays of 18 and 20 dots look much more similar numerically than arrays of four and six dots, even though both pairs differ by the same absolute value of two dots. “They’re clearly using a system that doesn’t have the precision that our symbolic number system allows us.”
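The ratio dependence Brannon describes can be made concrete in a few lines of Python (a hedged illustration, not code from her lab): under Weber’s Law, discriminability tracks the ratio between two counts, not their absolute difference.

```python
def weber_ratio(a: int, b: int) -> float:
    """Ratio of the smaller count to the larger; values near 1 are hard to tell apart."""
    return min(a, b) / max(a, b)

# Both pairs differ by exactly two dots, but their ratios differ sharply:
print(weber_ratio(18, 20))  # 0.9 -> near 1, hard to discriminate
print(weber_ratio(4, 6))    # about 0.67 -> much easier to discriminate
```

The equal absolute difference of two dots is irrelevant; it is the 0.9 versus 0.67 ratio that predicts which pair monkeys and infants can tell apart.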
With infants, Brannon’s team uses gaze duration and location to measure reaction to numerical changes. They “habituate” babies by showing them an array of dots repeatedly, then switching to a new display that includes that same array alongside one with a different numerical value. If they gaze longer at the numerically novel array—as they frequently do—Brannon infers that they can discriminate between differing quantities. Babies, like monkeys, are also limited by the ratio between the quantities.
“Our findings show us that there are precursors to mathematical abilities in monkeys and babies, which means language is not necessary for basic quantitative thinking,” Brannon says. “But without language, what animals and babies can’t do is appreciate the value of 1,362. Without a language-based counting system, they can only get so far.”
Like genes, language mutates
Suspecting that language evolves the same way living things do—through both natural selection and random changes—Joshua Plotkin, professor of biology, teamed up with linguistics professor Robin Clark and other researchers to test his theory.
The group analyzed more than 100,000 texts dating from the 12th to the 21st century, homing in on past-tense verbs that have evolved from regular to irregular (sneaked/snuck, think/thought). Most of these changes, they found, appeared to result from random chance rather than selective pressures. Occasionally, though, “survival of the fittest” was at play.
The team identified a pattern: Changes to rarely used verbs were due to chance—but changes to more common verbs were likely driven by selection, including people’s penchant for rhyming. For example, the “irregularization” of “dived” to “dove” coincided with the invention of cars and the corresponding rise of the rhyming irregular pair drive/drove. Additionally, expanded use of “quit” instead of “quitted” coincided with a rise in use of “hit” and “split.” However, says Plotkin, “the vast majority of verbs we analyzed show no evidence of selection whatsoever.”
This story, by Karen Brooks, originally appeared in Omnia Magazine.
Illustrations by Gracia Lam.