Does a Bigger Brain Mean a Higher IQ? Nope, and This Is Why

The idea that ‘more is better’ owes its start to controversial theories.

The brain. Credit: TheDigitalArtist/pixabay

As a student of neuroscience, what impresses me most about the way science has permeated modern culture is the number of movies that talk about our brains. Many of them do a good job of conveying how beautiful and interesting our brains are. Pixar’s Inside Out (2015) was a masterpiece. It deepened our interest in understanding how the organ produces our emotions and stores our memories. It also effectively de-stigmatised sadness and depression.

But these brain-based box-office hits can also get the science spectacularly wrong. Often, their errors are indicative of how the science of the brain has been introduced to society.

In the 2011 film Limitless, Bradley Cooper is a struggling writer who, with the help of a pill, is suddenly able to “access all 100%” of his brain. In Lucy (2014), Scarlett Johansson gets access to 100% of her brain and beats up a bunch of bad guys. The outrageousness in these movies allows most viewers to classify much of what they see as fantasy. However, dialogues like the one delivered by Morgan Freeman’s character in Lucy stick: “It is estimated most human beings only use ten percent of their brain’s capacity”.

If we can figure out a way to use more of our brains, can we be smarter? And if 100% is too much, what about 20%?

Controversial origins

Time out! This “10%” idea is a myth, often traced to the preface of Dale Carnegie’s 1936 book How to Win Friends and Influence People. Over the course of our lives, we likely use all of our brains and all of the neurons in them. How many neurons of which kind are active at a given point depends chiefly on what we’re trying to do at that time. For example, reading a book requires us to use our visual systems and the brain areas involved in language processing, together with those controlling our hand and eye muscles. Deciding whether to sleep for longer or to get up, on the other hand, requires a completely different set of areas. Going from knowing which areas are involved to the percentage of the brain that is active isn’t easy at all.

However, the idea that ‘we use 10% of our brains’ has captured human imagination and spread widely through society. If activating more of our brains can make us better, then having more brain area and more neurons should be desirable. It has led to the belief that a larger brain means a higher IQ.

Recently, scientists at the National Brain Research Centre (NBRC), Manesar, have started work to create an ‘Indian brain template’, a picture of what the average Indian’s brain looks like. An earlier study that attempted to build such a template found that Indian brains are on average smaller than their Caucasian counterparts.

Naren Rao, a coauthor of the earlier study, told The Hindu last week that smaller brain size does not imply less intelligence. The fact that brain scientists even feel the need to say this speaks to how deeply the notion that ‘more is better’ has entrenched itself.

More is not always better. More neurons and more activity than necessary can often lead to dysfunction.

In the mid-19th century, Samuel George Morton, a physician at the University of Pennsylvania, claimed that brain sizes differed between races: that Caucasians had the largest brains, followed by Native Americans, Africans and African Americans. Morton proposed that the size of the brain was a measure of one’s intelligence, and that Africans and African Americans were less intelligent for this reason; later proponents clubbed his measurements with results from early IQ testing to press the same claim.

These findings have since been widely criticised, with author and evolutionary biologist Stephen Jay Gould writing that Morton’s results were simply artefacts of his intrinsic biases.

Staving off hyperactivity

Since Morton, a large body of work in anthropology, genetics and psychology has questioned the validity of his results. Women have, on average, smaller brains than men but perform just as well on IQ tests. Several studies have shown that differences in test performance between Caucasians and African Americans narrow when African American test-takers are told that they are solving a puzzle rather than taking a timed, high-pressure test.

Based on such results, scientists have suggested that most differences in IQ test scores stem from differences in the educational resources test-takers have access to, as well as from societal biases. Questions have also been raised about what intelligence really is and whether IQ tests actually measure it.

The idea that ‘more is better’ owes its start to such controversial and since-rejected theories.

Basic neuroscience research has supplied many counter-examples. Consider the nerve cells that collect sensory information from our limbs and pass it on for our brains to process. Starting in the 1930s, Rita Levi-Montalcini and Viktor Hamburger showed that, as an embryo develops, more of these sensory neurons are born than are necessary. To survive, these neurons must receive a molecular signal, a growth factor, supplied by the tissues they connect to. Because only a limited amount of this signal is available, only a limited number of neurons survive. Hamburger extended these findings to neurons in the spinal cord, and others have shown that the same holds in other regions of the brain as well.
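The logic of this competition is simple enough to capture in a few lines of code. Here is a minimal sketch in Python – the function name, the numbers and the assumption that every neuron needs the same fixed amount of signal are all mine, for illustration only:

```python
import random

def trophic_competition(n_neurons=1000, signal_supply=600.0, need_per_neuron=1.0):
    """Toy model: overproduced neurons compete for a limited survival signal.

    Each neuron must capture `need_per_neuron` units of growth factor to
    survive; the target tissue supplies only `signal_supply` units in total,
    so the supply, not the initial head count, sets how many neurons live.
    """
    neurons = list(range(n_neurons))
    random.shuffle(neurons)  # capture order stands in for chance differences

    survivors = []
    remaining = signal_supply
    for neuron in neurons:
        if remaining >= need_per_neuron:
            remaining -= need_per_neuron
            survivors.append(neuron)
        # neurons that fail to capture enough signal die off

    return survivors

print(f"{len(trophic_competition())} of 1000 neurons survived")  # prints 600
```

However many neurons are born, the outcome is capped by the signal supply: making ‘more’ neurons changes nothing about how many the system ends up keeping.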

The number of connections between neurons – a.k.a. synapses – is also reduced, both before and after birth. This process, called pruning, is essential for information to be communicated properly between different parts of the brain. Incomplete synaptic pruning is believed to be a cause of synaesthesia in adults, a condition in which two different sensory qualities become mixed. For example, synaesthetes might perceive certain sounds as having colours.

For the brain to function properly, it’s clear that it needs a certain number of cells and connections between them. Too many of either can be bad for the organism’s survival.

The brain processes information and guides behaviour through the electrical activity of its neurons. This activity is generated when charged ions flow into and out of neurons through molecular channels embedded in their membranes. When enough ions have crossed the membrane to push the neuron’s voltage past a threshold, the neuron produces an action potential. This is the information currency used by the brain.

Most of the time, large increases in the number of action potentials produced in any region of the brain – called hyperactivity – portend bad news. For example, in people suffering from seizures, the channels through which the ions flow are altered in such a way that the neuron’s threshold is reached more easily, leading to hyperactivity.
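This threshold logic is easy to make concrete with the ‘leaky integrate-and-fire’ neuron, a standard textbook abstraction. In the sketch below (in Python; every parameter value is arbitrary, chosen only for illustration), input current pushes the model’s voltage up, a leak pulls it back towards rest, and a spike is fired whenever the voltage crosses threshold:

```python
def simulate_lif(threshold_mv=-50.0, input_current=1.6, duration_ms=200.0):
    """Leaky integrate-and-fire neuron: count spikes over a stretch of time.

    The membrane voltage leaks towards rest while the input current pushes
    it up; whenever the voltage crosses `threshold_mv`, the neuron 'fires'
    an action potential and the voltage resets.
    """
    dt = 0.1           # time step, ms
    v_rest = -65.0     # resting potential, mV
    v_reset = -70.0    # post-spike reset potential, mV
    tau = 10.0         # membrane time constant, ms
    resistance = 10.0  # membrane resistance, arbitrary units

    v = v_rest
    spikes = 0
    for _ in range(int(duration_ms / dt)):
        # leak pulls v back towards rest; input current pushes it up
        v += (-(v - v_rest) + resistance * input_current) * (dt / tau)
        if v >= threshold_mv:  # threshold crossed: fire and reset
            spikes += 1
            v = v_reset
    return spikes

print(simulate_lif(threshold_mv=-50.0))  # a handful of spikes in 200 ms
print(simulate_lif(threshold_mv=-55.0))  # 'seizure-like': far more spikes
```

The same input that drives a modest firing rate at the normal threshold produces a much higher one once the threshold drops – more activity without any extra input.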

In patients suffering from Alzheimer’s disease, the formation of plaques – “abnormal clusters of protein fragments” that “build up between nerve cells”, according to the Alzheimer’s Association – and the reduction in neuronal activity associated with the disease are preceded by a period of hyperactivity, observed in the hippocampus and the cortex. Scientists think this hyperactivity is caused by a shortfall in the number of cells that help inhibit it. In this case, it seems fewer cells lead to more activity.

Getting it just right

Findings like the ones discussed above suggest that, like Goldilocks, the brain has settled at a point that is just right: just the right number of cells, talking to each other in just the right number of ways, producing just the right amount of activity for each task.

An important part of achieving this balance is played by cells that ‘police’ other cells, using negative feedback loops to keep them from overperforming. These inhibitory neurons help convert sensory information into a form that is more usable by various parts of the brain – to make decisions, form and recall memories, and behave appropriately.

For example, in the olfactory system, these inhibitory neurons play a role in making sense of the odour an animal is smelling. The olfactory system starts at the nose, where olfactory receptor neurons (ORNs) convert odours into electrical activity and send it through nerve fibres (axons) to a part of the brain called the olfactory bulb. There, the axons of all the ORNs that respond to the same group of odours come together to form a structure called a glomerulus. The glomeruli then organise this information and send it to the next layer of odour-processing neurons.

In this system, inhibitory neurons are present between and within glomeruli. They are activated by ORN activity and perform a kind of gain control, ensuring that the excitatory neurons receiving inputs from ORNs don’t become hyperactive. Each odour we smell elicits a specific pattern of activity in the olfactory bulb; without the inhibitory neurons in the bulb, this pattern would dissolve into chaos.
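One common way to picture this kind of gain control is divisive normalisation: the pooled inhibitory neurons ‘measure’ the total input and divide each glomerulus’s response by it. The sketch below is a deliberately simplified illustration in Python – the function name, the inhibition strength and the odour vectors are all hypothetical:

```python
def normalise_glomeruli(orn_inputs, inhibition_strength=0.1):
    """Toy divisive gain control over glomerular responses.

    Pooled inhibition scales every glomerulus's response down by the total
    ORN drive, so stronger odours compress the overall output while the
    pattern across glomeruli - which identifies the odour - is preserved.
    """
    total_drive = sum(orn_inputs)  # what the pooled inhibitory neurons 'see'
    return [x / (1.0 + inhibition_strength * total_drive) for x in orn_inputs]

weak_odour = [1.0, 4.0, 2.0]       # hypothetical ORN drive per glomerulus
strong_odour = [10.0, 40.0, 20.0]  # the same odour, ten times stronger

print(normalise_glomeruli(weak_odour))    # ~[0.59, 2.35, 1.18]
print(normalise_glomeruli(strong_odour))  # ~[1.25, 5.00, 2.50]
```

A ten-fold jump in input produces only about a two-fold jump in output, and the relative pattern – 1:4:2 across the three glomeruli – survives intact. That is the point of the inhibition: keep the odour code readable without letting the activity blow up.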

So, more isn’t always better. Neurons are killed, pruned and silenced by other neurons to ensure that the brain is neither underwhelmed nor overwhelmed as it keeps us going. We use different parts of our brains to perform different tasks, and we likely use all of our brain over the course of a single day. There is no fixed percentage of the organ that we’re limited to. And in the long run, precisely which of our neurons are active, and when, likely matters more than how many.

Adithya Rajagopalan is a PhD student in the neuroscience department at Johns Hopkins University.