The definitive version of this post was originally published on December 23, 2014 on the PLOS Neuroscience Community website, where I serve as an editor.
Ever since video games became widely available, they have marked a strong generational divide: most of today’s grandparents probably never played video games, whereas most of their grandchildren play them daily. Now that recent scientific findings suggest video games might influence brain function for the better, many companies have started selling “brain games”, or computerized cognitive training programs, creating a market worth close to $1 billion per year.
What’s more, some of these companies are seeking the Food and Drug Administration’s approval to use these computer programs in healthy older adults to compensate for the effects of aging on cognition. But there may be a long way to go before the sight of an elderly person bashing away at a handheld console in the clinic waiting room becomes routine: the neuroscience of video games and their cognitive impact is still in its infancy, and academic researchers in the field warn that the promises made by some companies amount to quackery more than to solid science. A new meta-analysis, recently published in PLOS Medicine, reviews the field and points out which types of brain games might work, and which might not.
A meta-analysis is a type of medical research article in which scientists aggregate the results of individual studies to assess whether a particular intervention has consistent effects across studies, and to estimate how large those effects are. This meta-analysis focused on the effects of computerized cognitive training in healthy older adults (roughly 60 years and older).
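To give a feel for how such pooling works, here is a minimal sketch of a fixed-effect meta-analysis using inverse-variance weighting, the most common way of combining effect sizes. The numbers are made up for illustration and are not taken from the studies in the actual paper.

```python
from math import sqrt

# Hypothetical per-study results: (standardized effect size d, standard error).
# These values are invented for illustration only.
studies = [
    (0.30, 0.15),
    (0.10, 0.10),
    (0.25, 0.20),
]

# Inverse-variance weighting: more precise studies (smaller SE) count for more.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = sqrt(1 / sum(weights))

print(f"pooled effect: {pooled:.3f} ± {pooled_se:.3f}")
```

Note how the pooled estimate sits closest to the most precise study, and how its standard error is smaller than any single study’s: that is the whole point of aggregating.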
Better at What?
Studies were included if the participants were tested on cognitive tests both before and after the training. Importantly, those tests needed to be different from the ones trained in the brain games: we know that playing Sudoku makes you better at playing Sudoku, but the real question is whether it makes you better at something else, too. The type of computerized cognitive training in the studies varied widely, from studies that simply had participants play off-the-shelf video games (Tetris, Rise of Nations and Medal of Honor were among them) to custom-developed programs specifically designed to train one or several capacities such as working memory, attention, processing speed, verbal memory, visuospatial skills or executive functions.
Altogether, the authors identified 52 studies of sufficient quality to be included in the meta-analysis. Overall, they found that computerized cognitive training was associated with a significant but very small improvement in cognitive performance. Most importantly, the authors offer a few pointers for further studies.
- First, because the improvement in performance brought on by computerized cognitive training is expected to be small, studies should be sufficiently powered, i.e. include enough participants (about 90 people would be the minimum).
- Also, group-based training had a positive effect on performance, whereas at-home training did not.
- Perhaps surprisingly, training 1 to 3 times per week proved effective, but studies with more intensive training schedules did not, suggesting that the negative consequences of fatigue might offset the extra time dedicated to practice.
- On the other hand, studies in which each practice session lasted less than 30 minutes found no benefit.
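The sample-size point can be made concrete with the textbook normal-approximation formula for a two-group comparison of means. This is a generic statistical sketch, not a calculation from the paper, and the effect sizes below are arbitrary examples:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate participants needed per arm to detect a given
    standardized effect size with a two-sided test (normal approximation)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value, ~1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)           # ~0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# The smaller the expected effect, the more participants are needed:
for d in (0.2, 0.3, 0.5):
    print(f"effect size {d}: about {n_per_group(d)} participants per group")
```

Exact requirements depend on the study design; within-subject comparisons, for instance, typically need fewer participants than this two-independent-groups approximation suggests.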
Cognitive Improvement Not Tied to Working Memory
The researchers also found that the details of what the computer programs had the participants do were important. For instance, working memory is often thought of as our mental notepad: the limited quantity of information that we can keep in mind from one moment to the next. (Working memory is often assessed by having participants hold a series of digits in mind for a few seconds; how good are you at remembering a phone number that you just read? For most of us, seven digits is the upper limit.) Working memory is thus implicated in multiple aspects of cognition. Nevertheless, the meta-analysis revealed that training that specifically targeted working memory did not improve other cognitive functions.
Like any type of scientific research, meta-analyses have limitations, mostly related to the heterogeneity of the individual studies they attempt to combine. Here, a major shortcoming was that most studies did not assess whether the effects of computerized cognitive training lasted beyond the moments immediately following the practice session. The meta-analysis therefore cannot answer the crucial question of whether “brain games” have any lasting positive impact on cognition, let alone whether they can fend off the adverse effects of aging. Also, the potential benefits of computerized cognitive training were generally assessed only with psychology laboratory tests, leaving aside the burning question of whether gains on those tests translate into progress in real-life situations, such as remembering appointments or resisting distractions while driving a car.
Importantly, about half of the studies used “wait-lists” or other types of passive control groups (in a wait-list design, participants assigned to the control group take the baseline cognitive tests and are then placed on a waiting list to receive the cognitive training at the end of the study). As pointed out in the comments to the article, passive control groups might have inflated the apparent differences with the intervention groups, compared with active control groups, in which participants train on something other than the computer programs. Active control groups are generally considered better from a methodological standpoint, but they are more time- and resource-consuming.
Consensus Paper Warns Against ‘Unwarranted Enthusiasm’ of the Brain Training Industry
The meta-analysis is not alone in tempering the enthusiasm for “brain games”: a few weeks earlier, a large group of cognitive psychologists and neuroscientists, led by the Stanford Center on Longevity and the Max Planck Institute for Human Development in Berlin, released a consensus paper on the evidence, or lack thereof, for the benefits of brain training. The consensus paper did not conduct a rigorous review of the existing literature, but because its authors are prominent scientists who know the state of the research inside out, its conclusions overlap with those of the meta-analysis to a very large extent.
Importantly, the authors of the consensus paper caution against the unwarranted enthusiasm of the “brain training industry” that massively overstates its products’ benefits. In their words, “the small, narrow, and fleeting advances [due to computerized cognitive training] are often billed as general and lasting improvements of mind and brain.” The consensus paper laments the exploitation by that industry of the understandable anxiety that older adults might have regarding the decline of their cognitive function.
All of this is not to say that computerized cognitive training has no effect whatsoever. Indeed, the meta-analysis does point to significant albeit small benefits. The authors of both the meta-analysis and the consensus paper suggest key ways to bring future research up to the highest scientific standards. To conclude, you don’t need to put down that Game Boy right now, but don’t forget to pause every once in a while and make time for hiking, gardening, socializing, and so on, all of which will benefit your brain and mind just as much!
Lampit, A., Hallock, H., & Valenzuela, M. (2014). Computerized Cognitive Training in Cognitively Healthy Older Adults: A Systematic Review and Meta-Analysis of Effect Modifiers. PLoS Medicine, 11(11). DOI: 10.1371/journal.pmed.1001756