I’ve observed an interesting divide in how people react to and interpret the term “the science of reading” (or “SOR” for short).
For some, the term elicits eager head nodding — it’s even become incorporated into the sales pitch of many a vendor of education products. For others, the term elicits a gut reaction akin to disgust.
There’s a lot wrapped up in how someone thinks of “science” at large that influences their reaction to the term “the science of reading.” But don’t just take my word for it. Keith and Paula Stanovich penned some really insightful pieces about this in the early 2000s, outlining how educators can understand and leverage science to inform their own instructional practice.
I’m going to tackle two pieces by them in two different blog posts. In Part I here, we will tackle the “styles of science,” from a 2003 piece by Keith Stanovich titled “Understanding the Styles of Science in the Study of Reading,” published in Scientific Studies of Reading.
A warm hat tip to Chris Schatschneider on Twitter for this article!
In this article, Stanovich lays out five “styles” of science that color how the overall term is understood:
- Correspondence vs Coherence
- Analytic Reductionism vs Holism
- Probabilistic Prediction vs Case-based Approach
- Robust-Process Explanations vs Actual-Sequence Explanations
- Consilience vs Uniqueness
Not the pithiest breakdown in the world, but there are some great quotes in here, as well as some useful frames for understanding perspectives on science.
Correspondence vs Coherence
This may be the most useful distinction.
A correspondence view is the truly scientific one: it means using objective data to form and test theories, which is the basis for the scientific method. A coherence view, on the other hand, is our innate predilection to create and respond to compelling narratives about the world.
When it comes to reading research, consider how initial theories were coherence based — they established a narrative that fit the observational data, namely that reading is primarily a “visual” skill in which whole words are recognized. But counterintuitively, empirical testing that better corresponds to how reading actually works has clearly demonstrated that reading is primarily a rapid connection of letters and letter-groups to their sounds.
Ken Goodman’s entire oeuvre was based on a coherence approach — and his narratives continue to retain a strong grip on educators. And arguments about advanced phonemic awareness continue apace as we speak — much of which is currently based more on an overarching theory and synthesis of research rather than on corresponding empirical studies.
On what a “correspondence” perspective entails:
“Simply, there is a real world out there that exists independently of our beliefs about it, researchers form theories about this world, and the theories that track the world best are closer to the truth and are thus a better basis for action. This is why planes don’t fall out of the sky, why bridges rarely collapse, and why my headache medication works more often than not.”
On what a “coherence” perspective entails:
“many in the qualitative research communities emphasize constructivist principles that put stress on the requirement that beliefs fit together in a reasonably logical way—the so-called coherence theories of truth.”
. . . “numerous authors have written about how the coherence doctrine, by linking itself with ecumenical notions such as tolerance and personal validation, obscures its uglier aspects. What has been obscured is how indiscriminate belief validation, with no check in external reality, creates a world that most of us would consider a nightmare. In this world, the witnesses and evidence in a jury trial are not sifted as to credibility because any piece of evidence put forward is equal to any other for the reason that all are valid by someone’s perspective.”
And on the resulting clash between these two views:
“. . . an extreme adherence to a correspondence theory of truth often necessitates the frustration of the strong human need for narrative coherence in explanation.”
“the explanatory frameworks that are generated by scientists working in the correspondence framework may not seem plausible to those who value coherence more, particularly the type of coherence that resonates with the narratives inherent in folk psychology.”
Analytic Reductionism vs Holism
While Stanovich makes it clear that correspondence aligns more closely with science than coherence, some of these other styles rely on a healthy balance, as here with analytic reductionism vs holism. Both views have their downfalls, and each is needed as a complement to the other.
Analytic reductionism means breaking things down into small parts that can be studied individually — this is the bread and butter of empirical research. Holism, on the other hand, means viewing small parts within their greater context and whole. I’ve spent a fair amount of time arguing the need to view individual schools from a more holistic view, and I currently spend much of my time (in my day job) arguing the need to view individual student data from a more holistic view, as well. But as Stanovich argues here, the translation of reading science to the classroom benefits from both approaches applied judiciously:
“Gone are the days when such investigations were couched as if comparing a disembodied mind interacting with a disembodied orthography. Investigators in this area appreciate the necessity of adding the learning environment and instructional context as interacting factors in the model of orthography and achievement links that is being developed. This area displays an additive holism rather than the subtractive holism that has soured so many scientists on that end of the analytic and holistic continuum.”
Probabilistic Prediction vs Case-based Approach
I found this distinction really useful as well, and very much related to the correspondence vs coherence dynamic. Researchers who conduct empirical studies are accustomed to probabilistic thinking, but most of us laypeople are compelled by case studies.
“We in the behavioral science of the study of reading are so accustomed to probabilistic explanation and prediction that we are prone to forget how alien it seems to the layperson, to teachers, and indeed to some working in our own field.”
Stanovich makes a useful point for understanding reading research: probabilistic prediction applies to aggregate data and decision-making, and outlier cases will always be found — which can lead some educators to write off the implications of aggregate data.
That said, both approaches have their utility and need to be wielded strategically:
“In many cases though, in the complicated field of reading—in its domains of application—it really is unclear whether we should be in probabilistic or . . .”
Robust-Process Explanations vs Actual-Sequence Explanations
This is another one in which both styles have an important function in translating reading science into practice. Robust-process explanations are based on theoretical principles which can apply across different examples, whereas actual-sequence explanations apply to specific cases that have happened.
“Those of us studying the psychology of the reading process are often after robust-process explanations, whereas we often address audiences who are interested in and oriented toward actual-sequence explanations. Teachers often want to know how this particular child reached this level of school achievement or this level of reading difficulty, as the case may be.”
Both of these are useful ways of getting at the truth.
“By subdividing robust-process explanations we get closer to actual-sequence explanations, and by aggregating actual-sequence explanations we get closer to a robust-process explanation. The two work in concert.”
Consilience vs Uniqueness
E.O. Wilson defines consilience as the “unification of knowledge by the linking of facts and fact-based theory across disciplines to create a common groundwork of explanation.” Such efforts are important in reading research, as robust theories draw upon different fields such as “connectionist modeling, cognitive neuroscience, and classroom studies of . . .”
Uniqueness, on the other hand, refers to a tendency to look for the flashy new thing that stands out and excites people.
“This concern for consilience contrasts with the faddish tendencies in the field of education to search for magic bullets and miracle cures deriving from theories that do not cohere with the knowledge being developed by allied disciplines. The quest for a magic bullet always tempts education to stray from valuing consilience.”
Like correspondence vs coherence, Stanovich positions consilience as far more firmly on the side of science.
What styles are scientific?
Somewhat unscientifically, since none of these “styles” are based on quantitative research, Stanovich guesstimates how these different styles can be applied in varying ways and still be considered scientific. In his view, the two styles that present the greatest challenge to the application of reading science are the tendencies of many in the education field to lean into coherence and uniqueness in their views of reading.
That said, Stanovich is not trying to enforce a rigid perspective of science here — he believes these other styles can work together, and he also warns against rigidity — which is a warning I think many who would consider themselves “Science of Reading” folks would do well to heed:
“I do fear that some would prefer rigid rules and bullet points on a PowerPoint presentation rather than the story of science in its full complexity—including the complexities of certain styles that are neither right nor wrong but represent continua (better viewed as parameters that we are constantly adjusting so as to facilitate the process). Science is a delicate epistemological game.”
“Science’s real uniqueness comes from its self-correcting nature. Its unique epistemic power comes from a very un-Promethean characteristic: its constant fiddling with things—with theory, experimental setups, techniques, and its styles of the type I have discussed. And we are not afraid to readjust—which itself, recursively, is one of science’s characteristic features—we are not afraid to implicitly admit previous error when we make a readjustment.”
“I fear that this flexibility, this juggling, this self-corrective mindset will be lost if we too rigidly reduce science to a set of rules. Do not misunderstand me, though—I am wholeheartedly in favor of instructing teachers and other educational personnel in what science is and is not and what are the unique features that underlie its epistemic power. But these features should not become a prison.”
There have been a lot of recent “scholarly disputes” about reading research, and Stanovich raises a procedure that would be nice to see used more: “adversarial collaboration.” In this process, the disputants agree on an arbiter, who then works with them to design experiments that can test the theories under dispute. The research is then published by the arbiter with the disputants as co-authors.
What a brilliant way to tackle some of the current debates about phonemic awareness!
“In the middle of the heat of the internecine warfare that sometimes takes place in our field, it is important to be able to differentiate legitimate scientific criticism that derives from background assumptions from the opposite ends of the style dimensions I have discussed here—and criticisms deriving from critics who are not playing the game of science at all.”
“Willingness to accept offers of adversarial collaboration might be a tool to use in distinguishing who is playing science from who is not.”