underlying paper by Jackson et al. (2011) makes no mention of random assignment of participants to different amounts of video game play, so there's no justification for inferring causality. (Additionally, it notes that the video-game players had lower GPAs.)
Second, they say that "it just so happens that surgeons benefit from video-game playing as well," citing a study by Rosser et al. (2007) that found that "surgeons who played video games for more than 3 hours a week made 37% fewer errors and were 27% faster in laparoscopic surgery and suturing drills compared to surgeons who never played video games." This is followed by speculation as to the mechanism by which game playing could cause these differences. However, the evidence of causation here is even weaker than for the study of children cited above. It's not even a longitudinal study—it's just a cross-sectional finding of an association between video game play and performance on (computerized) tests of surgical skill. Again, one need only read as far as the Rosser et al. abstract to find the statement "Video game skill correlates with laparoscopic surgical skills." There is no evidence of causality, but the textbook authors have said that surgeons "benefit" from playing video games.
Finally, the caption below the stock photo reinforces the thrust of the boxed text by asking "If video games can make you a better surgeon, what other areas of your life could playing video games improve?" As they say in courtroom dramas, "Objection! Assumes facts not in evidence."
Sure, this is a run-of-the-mill mistake that laypeople make all the time: mistaking evidence of correlation (video game playing co-occurring with increased spatial skill or surgical proficiency) for evidence of causation (playing video games making your spatial skills and surgical proficiency better than they were before). But this is a textbook on research methods in psychology. If the authors of such books have the proverbial "one job to do," it is teaching their readers what conclusions can be drawn from what kinds of evidence. That's what education in research methods is all about: learning to design research studies that have the power to permit certain inferences, and learning which inferences can and cannot logically follow from which designs. You can think of analogies to other fields—a nutrition book reversing the properties of carbohydrates and fat? An algebra textbook getting the quadratic formula wrong? A history book that confuses the Declaration of Independence with the Constitution? Correlation versus causation is not a nuance or side issue; it's at the heart of the behavioral science enterprise.
The authors of Discovering the Scientist Within must understand the distinction between correlation and causation, and I am sure they can generate the plausible alternative (non-causal) explanations for these video game results that I mention above. I know this because on page 30, in the paragraph immediately before the "Research Spotlight" box, they write, "often there is not a set direction of how one thing influences another ... News coverage, such as in cases of school shootings, often portrays playing video games as the cause of aggressive behavior. Yet it is equally likely that aggressive individuals gravitate toward violent video games" [emphasis added].
The fact that mistakes like this can turn up in a book meant to educate its readers to avoid them is remarkable, and I think it goes to show just how confounding sound causal inference can be for the human mind. As Daniel Simons and I argued in The Invisible Gorilla, human beings are susceptible to an "illusion of cause" that leads us to jump to particular causal conclusions in all kinds of situations where the evidence we have doesn't logically justify them—indeed, where other explanations are equally or even more likely, and where the assumption of causality can get us into big trouble. The ease with which we can generate mechanisms to explain a particular causal inference can contribute to the illusion. For example, being aware of the "neural plasticity" concept could make it seem more likely that intensive cognitive work (e.g., video-gaming) might "train" some more fundamental underlying cognitive capacity (e.g., spatial skill) or transfer to some other practical task (e.g., surgical proficiency). None of us, not even psychology professors who write textbooks on research methods, are immune to these fundamental thinking pitfalls.
Hopefully the second edition of Discovering the Scientist Within will correct these correlation/causation errors, as well as any other issues that may lurk in the text. My quick flip-through picked up one more passage the authors might want to think about rewriting:
... the author Malcolm Gladwell, a self-described "cover band for psychology," is known for his ability to summarize and synthesize psychological findings so that the general public can benefit from the exciting advances in knowledge that psychological researchers have made.

Accompanying this sentence on page 41 is a photo of Gladwell's book Outliers. As many readers of this blog will know, I don't agree that the general public is benefitting from Malcolm Gladwell's writing, precisely because he doesn't summarize and synthesize as well as people think he does. Since correlation, causation, and statistical thinking are among the things Gladwell has difficulty with, it doesn't seem like a research methods textbook should be endorsing his work.