By the time you are 20

Maria Isabel Garcia


Learning is more complex than just going to school, getting a job, and playing Sudoku or crossword puzzles to keep your mind together later in life

At middle age or older, are you as fired up to learn as you were before you turned 20? Think about that for a while. Is there a marked difference in your desire to learn? Is there a gap between how much you think you learned before you were 20 and after?

According to the World Health Organization, by 2020, people 60 years and older will outnumber children younger than 5 years old. Around the world, institutions, public and private, are rallying for lifelong learning, and rightly so. There is evidence that continuous learning (in the form of a job or education, or even leisure activity), as well as regular physical activity throughout one’s life, shores up “brain reserves” to fight aging-associated decline and even Alzheimer’s.

What these “brain reserves” look like, what form they take, and how exactly they come about is still largely a mystery. It is like Dark Energy: we just know it exists, even if we cannot see it, because everything else would not make sense without it. For instance, there are people whose brains, examined after death, were found to bear Alzheimer’s telltale marks, even though they did not exhibit symptoms while they were living. Some scientists think that this may be tied to “brain reserves.”

A new study that came out recently in the Proceedings of the National Academy of Sciences (PNAS) may punch even a small hole in this mystery to let a little bit of light come in.

The study could, at first glance, be shattering for those of us 20 years old or over. It found that your intelligence (technically termed general cognitive ability, or GCA) in late adulthood is largely predicted by what your intelligence was by the time you were 20 years old. This GCA covers the domains through which your brain negotiates with the world and with itself: abstract reasoning, episodic memory, processing speed, verbal fluency, visual-spatial ability, working memory, and executive function.

Scientists also looked at the cortex, the “thinking” part of the brain (think “Mission Control”), to see how it related to GCA at 20. They are not saying that the cortical surface is the exclusive coveted nest of our “brain reserves,” but given what they know about the activity that happens here that is directly related to intelligence, this neural terrain is a very good place to look for it, or parts of it. They found that GCA at 20 was associated with the surface area (not the thickness) of the cortex, while “lifetime education” had no relation to the cortical surface area.

GCA at 20 was not the only predictor, of course. There were other factors like occupation, education, and lifelong experiences. But all of these, based on the study, had only a single-digit influence on later-life intelligence. It was one’s intelligence at 20 years old that had as much as a 20% to 40% effect on one’s GCA at around 60.

So what does this mean? Does it mean that all we do to make ourselves better after 20 will amount to practically nothing? Does it mean that all the efforts, personal and professional, that bank on learning after 20 will just be mere “palliatives” for the brain decay that we are all generally bound for?

No. It does not mean we abandon our brains after 20. First, the study was on American males only, since it had to make use of data that already existed for other purposes and the researchers had to compare GCA at 20 and at around 60. That alone gives you an idea of its limitations, but in science, that is a start.

The study reveals, yet again, that learning is more complex than just going to school, getting a job, and playing Sudoku or crossword puzzles to keep your mind together later in life. It means that there is no clear-cut prescription that will keep your mind from receding from you in your third decade and later, but there are clues as to what may influence it more. Here, they seem to point to GCA at 20 as having a far greater influence on your later life than your education or job.

It also means that the biology of our brains before we are 20 shapes them in lasting ways. An overwhelming number of studies on children have framed the years 0-5 as the age of the great wiring, when they learn things deeply, including a sense of morality largely shaped by empathy. The wiring during the adolescent years is marked more by speed and by the scope of the things we can learn at that stage. So it should be no surprise if a large proportion of these “brain reserves” are built up before you are 20. The researchers also offered an explanation: perhaps, if by the time you are 20 you have really made a mental pact with learning, that is the one thing that will drive you to find an interesting job or engage in other activities that sustain your GCA later in life.

But this does not mean that other sources for your “brain reserves” are useless to tap. While generic, lifelong activities like education and jobs were part of the study, it did not investigate differences among these kinds of activities in terms of depth, intensity, or even frequency. These could play a larger role than the study found.

Also, a commentary on the study rightly pointed out that there are experiences later in life that could “deduct” as much as “add” to one’s intelligence. The study assumed that lifelong experiences unequivocally add “intelligence points” when, obviously, there are mental and physical experiences, such as traumatic ones, that could diminish one’s GCA.

I think one of the fundamental “walls” we run into – scientifically or not – when we tackle intelligence is that there is really no foolproof way of measuring it faithfully. People, brain experts or otherwise, cannot even agree on a definition of intelligence. IQ and other skill tests measure only how we score on the items they test for, but in no way do they tell you about the absolute value of an individual. I still do not think that artificial intelligence could ever measure the natural complexity of a human being’s idiocy or intelligence. – Rappler.com

Maria Isabel Garcia is a science writer. She has written two books, “Science Solitaire” and “Twenty One Grams of Spirit and Seven Ounces of Desire.” You can reach her at sciencesolitaire@gmail.com.
