Writing to Learn Research – Messy

Writing to learn is one of those topics that keeps drawing my attention. I have an interest in what can be done to encourage learning, and I approach this interest by focusing on external tasks that have the potential to manipulate the internal cognitive (thinking) behavior of learners. I take this perspective as an educational psychologist with a cognitive orientation. I have a specific interest in areas such as study behavior, trying to understand what an educator or instructional designer can do to promote experiences that will help learners be more successful. The challenge seems obvious – you cannot learn for someone else, but you may be able to create tasks that, when added to exposure to sources of information, encourage productive “processing” of those experiences. We can ask questions to encourage thinking. We can engage students in discussions that generate thinking through interaction. We can assign tasks that require the use of information. Writing would be an example of such an assigned task.

Writing to Learn

Writing to learn fits this position of an external task that would seem to encourage certain internal behaviors. To be clear, external tasks cannot control internal behavior. Only the individual learner can control what they think about and how they think about it, but for learners willing to engage with an external activity, that activity may change the likelihood that productive mental behaviors are activated.

I found the summary of the cognitive benefits of writing to learn useful and consistent with my own way of thinking about other learning strategies – external tasks that encourage productive internal behaviors. Writing based on content to be learned requires that the writer generate a personalized, concrete representation at the “point of utterance”. I like this expression. To me, it is a clever way of saying that when you stare at the screen or the empty sheet of paper and must fill the void, you can no longer fool yourself – you either generate something or you don’t. You must use what you know, and how you interpret the experiences that supposedly have changed what you know, to produce an external representation.

To produce an external product, you must think about what you already know in a way that brings existing ideas into consciousness (working memory) by following the connections activated by the writing task and newly acquired information. This forces processing that may not have occurred without the external task. Connections between existing knowledge and new information are not necessarily made just because both exist in storage. Using knowledge to write or to perform other acts of application encourages making connections.

Such attempts at integration may or may not be successful. Having something external to consider offers the secondary benefit of forced metacognition. Does what I wrote really make sense? Do the ideas hang together or do I need to rethink what I have said? Does what I have proposed fit with the life experiences (episodic memory) I have had? 

Writing ends up as a generative process that potentially creates understanding and feeds the product of this understanding back into storage.

Graham, Kiuhara, and MacKay (2020)

In carefully evaluating and combining the results of many studies of writing to learn, these researchers intended not only to determine whether writing to learn had the expected general benefit but also to use the variability of writing tasks and outcomes across studies to deepen our understanding of how writing to learn encourages learning. Surely, some activities would be more beneficial than others because of the skills and existing knowledge of learners or the specifics of the assigned writing tasks. So, the meta-analysis asks, first, whether there is a general effect (Is writing to learn effective?) and, second, whether there are significant moderator variables that may help practitioners decide when, with whom, and how to structure writing to learn activities.
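
For readers curious about the mechanics, the core logic of combining studies is that each study contributes an effect size and more precise studies get more weight. Below is a minimal Python sketch of simple inverse-variance pooling; the numbers are invented for illustration, and Graham and colleagues’ actual procedure was more sophisticated than this (random-effects modeling, moderator analyses, and so on):

    import math

    # Hypothetical effect sizes (d) and sampling variances from five studies.
    # These values are invented; they are not from Graham et al. (2020).
    effects = [0.45, -0.10, 0.30, 0.55, 0.20]
    variances = [0.04, 0.09, 0.02, 0.06, 0.03]

    # Inverse-variance weights: more precise studies count more.
    weights = [1 / v for v in variances]

    # Weighted mean effect size across the studies.
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)

    # Standard error and 95% confidence interval of the pooled estimate.
    se = math.sqrt(1 / sum(weights))
    low, high = pooled - 1.96 * se, pooled + 1.96 * se

    print(f"Pooled effect size: {pooled:.2f}, 95% CI: ({low:.2f}, {high:.2f})")

Note how one negative study (-0.10) gets averaged in rather than canceling the conclusion; the pooled estimate and its interval carry the overall answer.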

Graham and colleagues focused only on K-12 learners. Potential moderator variables included grade level, content area (science, social studies, mathematics), type of writing task (argumentation, informational writing, narrative), and some others. I have a specific interest in argumentation, which is relevant here as a variable differentiating the studies because it requires a deeper level of analysis than, say, a more basic summary of what has been learned.

Overall, the meta-analysis demonstrated a general benefit for writing to learn (effect size = .30). This level of impact is considered the low end of a moderate effect. Graham and colleagues point out that the individual studies included in the analysis varied greatly. A number of the studies demonstrated negative outcomes, meaning that in those studies the control condition performed better than the group spending time on writing to learn. The authors propose that this variability is informative, as it cannot be assumed that any approach carrying this label will be productive. The variability also suggests that the moderator variables may reveal important insights.
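
As a rough, back-of-the-envelope translation of that number (my interpretation, not a figure from the paper): an effect size of .30 means the average student in a writing condition scored about three-tenths of a standard deviation above the average control student. Assuming roughly normal outcomes, that is like moving from the 50th to about the 62nd percentile of the control distribution:

    import math

    # Effect size reported by the meta-analysis.
    d = 0.30

    # Percentile of the control distribution where the average treated
    # student lands, using the standard normal CDF.
    percentile = 0.5 * (1 + math.erf(d / math.sqrt(2))) * 100
    print(f"{percentile:.0f}th percentile")  # about 62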

Unfortunately, the moderator variables did not reach the level of statistical significance necessary to support useful insights about how writing to learn works or who is most likely to be a priority group for this type of activity. Grade level was not significant. Content area was not significant. Type of writing task was not significant.

Part of the challenge here is having enough studies focused on a given approach, with enough consistency of outcomes, to allow statistical certainty in arguing for a clear conclusion. Studies that involved taking a position and supporting that position (e.g., argumentation) produced a much larger effect size, but the moderator test did not reach the level at which a confident conclusion could be claimed.
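
To make the statistical issue concrete with invented numbers: a moderator category supported by only a handful of studies carries a wide confidence interval, so even a noticeably larger average effect can remain statistically inconclusive. A hypothetical sketch:

    def ci_95(effect, se):
        # 95% confidence interval around a pooled effect estimate.
        return (effect - 1.96 * se, effect + 1.96 * se)

    # Invented values: the overall estimate rests on many studies (small
    # standard error), the argumentation subgroup on only a few (large one).
    overall = ci_95(0.30, 0.07)
    argumentation = ci_95(0.68, 0.36)

    print(overall)        # roughly (0.16, 0.44): clearly above zero
    print(argumentation)  # roughly (-0.03, 1.39): larger effect, but the
                          # interval is too wide to claim a reliable difference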

One interesting observation from the study caught my attention. While writing to learn is used more frequently in social studies classrooms, the number of research studies associated with each content area was smallest for social studies. Think about this. Why? I wonder if the preoccupation of researchers and funding organizations with STEM is responsible.

More research is needed. I know practitioners and the general public get tired of being told this, but what else can you recommend when confronted with the messiness of much educational research? When you take ideas out of carefully controlled laboratories and try to test them in applied settings, results like these are fairly typical. Humans, left to their own devices as implementers of procedures and reactors to interventions, are all over the place. Certainly, the basic, carefully controlled research and the general outcome of this meta-analysis of writing to learn implementations are encouraging, but as the authors suggest, the variability in effectiveness means something, and further exploration is warranted.

Reference

Graham, S., Kiuhara, S. A., & MacKay, M. (2020). The effects of writing on learning in science, social studies, and mathematics: A meta-analysis. Review of Educational Research, 90(2), 179-226.
