Today’s educational technology often presents itself as a radical departure from the tired practices of traditional instruction. But in one way, at least, it faithfully follows the conventions of the chalk-and-blackboard era: EdTech addresses only the student’s head, leaving the rest of the body out.
Treating mind and body as separate is an old and powerful idea in Western culture. But this venerable trope is facing a challenge from a generation of researchers—in cognitive science, psychology, neuroscience, even philosophy—who claim that we think with and through our bodies. Even the most abstract mathematical or literary concepts, these researchers maintain, are understood in terms of the experience of our senses and of moving ourselves through space.
This perspective, known as “embodied cognition,” is now becoming a lens through which to look at educational technology. Work in the field shows promising signs that incorporating bodily movements—even subtle ones—can improve the learning that’s done on computers.
For example, Margaret Chan and John Black of Teachers College, Columbia University, have shown that physically manipulating an animation of a roller coaster—by sliding the cars up and down the tracks and watching the resulting changes in kinetic and potential energy, as shown in a bar graph—helps students understand the workings of gravity and energy better than static on-screen images and text. This embodied approach to instruction, the authors found, is especially helpful to younger students and to those working on more difficult problems. In counterintuitive domains like physics, bodily rooted learning allows the learner to develop a “feel” for the concept being described, a physical sense that is more comprehensible and compelling than a concept that remains an abstract mental entity.
In similar experiments, led by Insook Han of Hanyang Cyber University in South Korea, students learn about the concept of force by using a joystick to move two gears shown on a computer screen. Han’s studies show that allowing users to physically manipulate the gears in this way improves their memory and problem-solving performance on force-related questions. The richer the perceptual experience provided by the computer program, the greater the students’ understanding and retention of the material.
One reason involving the body improves learning is that bodily movements give memory additional cues for representing and retrieving what has been learned. Taking action in response to information, in addition to simply seeing or hearing it, creates a richer memory trace and supplies alternative avenues for recalling the memory later on. Movement may also allow users to shed some of their “cognitive load”—the burden imposed by the need to keep track of information. Instead of trying to imagine what the gears would do if moved, a mentally taxing activity, learners can allow their hands to do it and see what happens, freeing up mental resources to think more deeply about what’s happening. This is pretty much how we all learned to drive.
From an evolutionary perspective, our brains developed to help us solve problems in the real world, moving through space and manipulating actual objects. More abstract forms of thought, such as mathematics and written language, came later, and they repurposed older regions of the brain originally dedicated to processing input from the senses and from the motor system.
This repurposing is apparent in the frequency with which we use physically grounded metaphors to express abstract ideas: counting is like moving through space (“the countdown is approaching zero”); accommodating two different principles is like “balancing” them on a scale. Bringing the body back into the equation can provide learners with a useful way station between concrete referents and all-out abstraction. Physically acting out knowledge to be learned or problems to be solved makes the conceptual metaphors employed by our brains a literal reality.
We can see this principle at work in the research of Arthur Glenberg of Arizona State University. In a series of experiments carried out more than a decade ago, Glenberg found that children’s reading comprehension improved when they acted out a written text, using a set of representational toys (a miniature barn and horse, for example, accompanied a story about a farm). Glenberg then demonstrated that the same procedure could work on a digital platform: In a 2011 experiment, he showed that having first- and second-grade students manipulate images of toys on a computer screen after reading a story benefits their comprehension as much as physical manipulation of the toys.
Mina Johnson-Glenberg (who is married to Arthur Glenberg and also works at Arizona State, as director of the university’s Embodied Games for Learning lab) is taking the embodied approach even further, designing educational games that engage learners’ entire bodies.
A program called the Alien Health Game, for example, presents students with this scenario: “You have just woken up to find an alien under your bed. It is hungry and it is your job to figure out what makes it healthy.” From an array of foods, users learn to choose the ones that are most nutritious, and then must dance, jump, and exercise to help the alien digest its meal. (A bonus: The game is so physically active that it measurably elevates users’ heart rates.)
In other work, Johnson-Glenberg employs Xbox Kinect–like technology to capture students’ movements as they interact with images projected onto a whiteboard. The interface can be used to teach subjects like physics, chemistry, and geology; in the geology module, for example, students use the movements of their bodies to deposit sediment layers, insert a fossil layer, and generate earthquakes in a virtual environment. The technology is being used in a half dozen American schools, including Quest to Learn in New York City and ChicagoQuest.
It’s early days for these real-world applications. But the research behind them can help teachers, parents, and students better evaluate and use educational technology products that are already widely available. An awareness of embodied cognition, for example, might lead users to prefer touch-sensitive devices like the iPad, which respond directly to the movement of users’ fingers on the screen, over computers that interpose a keyboard between screen and user. According to John Black of Teachers College, technology that evokes movements that complement the concepts to be learned is also likely to be effective from an embodied point of view: For example, an application in which counting is expressed by tapping on a mouse (discrete movements that complement the discrete nature of counting) will better promote learning than a program that asks users to make a sliding movement as they count (a continuous action at odds with the discrete nature of counting).
Parents and educators can also treat the virtual “movements” involved in many educational programs and games as preparation for more traditional learning. For example, John Black and Jessica Hammer, also of Teachers College, showed that moving through the virtual spaces of the history game Civilization made players much better at learning history from a conventional textbook than players of the game The Sims. Even though Civilization players possessed no greater knowledge of history when they opened the textbook, their virtually embodied experience of grappling with the issues raised by historical simulations made them better prepared to absorb the lessons of the text.
Educators and parents can also help students incorporate bodily movements of their own into the use of educational technology, an approach that Black has applied to teaching the programming language Scratch. Asking students to act out the motions they intend for the program’s virtual “agent” using their own bodies, and then programming the agents to make the same moves, has shown itself to be “a particularly effective learning approach,” Black writes. Even when they’re learning on computers, it’s wise to remember that students are more than mental machines.
As a Schwartz Fellow, Annie Murphy Paul will write a book about how we learn and how we can do it better. “Brilliant: The New Science of Smart” will be published by Crown and will explore how cognitive science, neuroscience, and the psychology of learning can be applied to worker training programs. This article was originally published in The Weekly Wonk, New America’s digital magazine.
Views expressed in this article are the opinions of the author and do not necessarily reflect the views of The Epoch Times.