Psychologists often use puzzles to test intelligence. So, puzzle this: on the one hand, many psychologists tell us that intelligence is an enduring individual trait, pretty much hard-wired by a person’s DNA and by cell development in the fetus’s brain. On the other hand, testing shows that there have been huge increases around the world in IQ scores over the last two or three generations—so large that most Western adults a century ago would be considered dimwits by today’s standards.
This puzzle emerged a couple of decades ago from psychologist James Flynn’s discovery of what is now labeled the Flynn Effect. He found that, in the United States, for instance, IQ scores rose about fifteen points between roughly 1950 and 2000. They are still rising. The ascent in scores can be quite rapid—a half-point per year for young eastern German men in the 1990s. Fragments of earlier American data and more complete testing in other Western nations suggest that the surge in scores began early in the last century. IQ scores may now be leveling off in northern European nations even as they are jumping in developing nations.
For about twenty years, psychologists have repeatedly confirmed this finding and argued about its meaning. The stakes are high because intelligence scores matter: they are the strongest predictors of how well people do in life. Solving the puzzle of the Flynn Effect has implications for understanding intelligence, intelligence testing, the brain, and social change. And it has implications—doesn’t everything?—for our politics.
• • •
Some observers respond to questions raised by the Flynn Effect by dismissing intelligence testing as an exercise in cultural domination. This ostrich-like response ignores the fact that IQ scores, whatever they measure, consistently correlate with important outcomes such as how well people perform their jobs and how long they live. Such dismissal also ignores the growing evidence that there is a physical, neurological basis to cognition and cognitive skills.
A more serious critique of the research attacks the definition of intelligence. Researchers in the intelligence field define it as a general capability to reason, understand complex ideas, think abstractly, and solve problems. You can measure it, they argue, using IQ tests. Critics consider these tests to be superficial and argue that they ignore other kinds of intelligence such as emotional intelligence or deeper traits such as wisdom. While researchers cannot track historical trends in wisdom, they are trying to wise up about the apparent historical increase in IQ.
One might suspect that the tests have gotten easier. They haven’t. In fact, the tests have gotten harder in order to keep the average IQ at one hundred. By reversing that process, Flynn showed the long-term rise in real performance.
Other challengers argue that we are not really smarter than our great-grandparents; it’s just that people today learn the answers to test questions in school or have become familiar with testing. However, scores on the parts of tests that are most easily taught and are the most culture-laden—say, recognizing vocabulary or knowing geography—have not changed much. Scores on those parts of tests that measure the most abstract, presumably culture-free thinking—say, drawing logical inferences from patterns in designs—have risen the most. The sorts of thinking that are supposedly most detached from classroom and cultural learning are the ones that have really improved.
So if a real increase in some kind of cognitive ability is under way, the question is why.
Some researchers argue that twentieth-century reductions in disease and improvements in nutrition produced healthier brains. Others, including Flynn, reject such biological explanations, pointing out, among other things, that scores continued to rise in affluent countries generations after they had achieved modern health and nutrition.
A stronger explanation of the Flynn Effect is that we do better abstract thinking than our ancestors did because our social environments promote abstraction. Universal extended schooling trains many more people in abstract thought, be it through Algebra I or role-playing in social studies classes. Modern work tasks, such as setting a machine tool or filling in a spreadsheet, call for and reinforce our ability to manipulate symbols. Our surroundings are increasingly filled with symbolic representations such as the icons on car dashboards and on computer “dashboards.” We also absorb the abstract codes of the omnipresent media. Toddlers learn to experience video, for instance. In flashing images on a flat surface, each from a different angle of vision, knit together with sound emerging from yet elsewhere, they see Miss Gulch taking away Dorothy’s Toto. And then they cry in fear about it. And an hour later, they ask to see it again, since they know all along that it is not real—whatever “real” means in such a world.
While we are today comfortable with sorting things abstractly—hammers and hoes are tools, while corn is a food—our ancestors tended more commonly to make concrete connections—hoes go with corn because hoes are used to raise corn.
Flynn puts it this way: “If the question is ‘Do we have better brain potential at conception . . . ?’ the answer is no. If the question is . . . ‘Have we developed new cognitive skills that can deal with [modern cognitive] problems?’ the answer is yes.”
Explanations of the Flynn Effect that focus on changes to the sensory environment are consistent with the growing number of studies that demonstrate the “plasticity” of the human brain. My favorite such study is of London cabdrivers, which found that as the cabbies learned to navigate the city, the part of their brains associated with visual maps thickened. Just learning to read—that is, learning to turn scratch marks into meaning—reshapes brain structure. Studies show that even the elderly can expand their cognitive acuity and alter their brains. We are not, as once was thought, doomed to get ever stupider after our twenties, although we can get stupider in lazy retirement.
The already-puzzling Flynn Effect has become a piece of another puzzle—understanding social and economic inequality.
It’s an old debate, fired up white hot by the publication of The Bell Curve in 1994: How much economic and social inequality is accounted for by inequalities in innate cognitive skill and how much by environmental circumstances? The book’s authors, Richard Herrnstein and Charles Murray, argue that individuals’ life outcomes are largely determined by their intelligence, which is largely determined by their genes. Furious responses to this argument rained down from the left.
The political logic of this debate holds that if inborn intelligence, far more than environment, determines outcomes, then social engineering for greater equality is pointless. (Ironically, Herrnstein and Murray drew from their analysis a recommendation for state paternalism, such as income supplements, to sustain those who are genetically doomed to remain unintelligent and unsuccessful.) Psychologist Carol Dweck has long argued that believing that intelligence is fixed becomes a self-fulfilling prophecy for students, discouraging ambitious effort. In a recent paper, she and her colleagues find that if people are led to think that anyone might become more intelligent, they are also likelier to support redistribution of education spending and even to favor hiring quotas.
The emerging understanding of intelligence, even as biologically rooted, suggests that it can be boosted beyond its starting point. We may not be able to shape the brain at conception or birth, but we can shape the physical brain and its operations in life. And perhaps we can shape it more intelligently than have whatever forces are responsible for the Flynn Effect.