Improving Project-Based Learning

Practitioners are likely frustrated by a familiar pattern in educational research: studies sometimes demonstrate that a tactic works and sometimes that it does not. The interest in what are sometimes described as constructionist tasks (e.g., problem-based and project-based learning) as an alternative to direct instruction is a good example of this situation. Here the results show not just the lack of a significant difference but competing claims that each approach is superior to the other. In this case, however, careful examination of the many existing studies provides a possible explanation. One of the best data-based examinations of this type we have encountered is a meta-analysis conducted by Alfieri and colleagues (Alfieri et al., 2011). These researchers conducted two meta-analyses: the first contrasted unassisted discovery experiences with direct instruction, and the second contrasted enhanced discovery experiences with direct instruction. The outcomes match the conflicting positions that have been argued. The studies involving unassisted discovery favor direct instruction; the studies involving enhanced discovery favor the constructionist tasks. This is helpful.

The next step seems obvious. Just what do the researchers claim constitutes an enhanced discovery experience? I can paraphrase what the researchers say, but you may feel you need something more concrete or task specific. The research identified three categories of enhanced discovery: generation, elicited explanation, and guided discovery. Outcomes could be further broken down by these three categories. Generation, the one treatment that did not produce a significant advantage, required learners to generate rules, strategies, or images, or to answer experimenter questions. The elicited explanation task required the learner to explain some aspect of the outcome task or the content. Guided discovery involved task-related feedback at several points during the assigned task.

The researchers identify the individual studies that were included in the meta-analysis, but this list does not indicate which category each study was classified under.

A secondary finding that K-12 educators will probably find disturbing was that the adolescent age group was least likely to benefit from the enhanced discovery experiences. The researchers speculated that motivation may be an issue with this group and these methods.

A recent meta-analysis (Lazonder & Harmsen, 2016) might be described as taking this lack of specificity as a starting point. Their analysis, while mostly considering learning by conducting scientific experiments, offers some more concrete suggestions. Conducting experiments without guidance typically revealed some predictable limitations: learners found it difficult to generate multiple hypotheses to evaluate, to design experiments that do a good job of testing a specific hypothesis, and, given the results of an experiment, to clearly state just what had been learned. Guidance might involve requesting that multiple hypotheses be listed, requesting a description of what would provide an uncontaminated test of a hypothesis, and asking whether the hypothesis should be accepted or rejected given the results of the experiment.

A reading of some of the studies included in the Lazonder and Harmsen (2016) meta-analysis offers some specific ideas. For example, the researchers refer to the value of a "process worksheet" (de Vries et al., 2008, was one study offered as an example). Alfieri, Brooks, Aldrich, and Tenenbaum (2011) focused on the value of periodic checkpoints with feedback. This combination (a process worksheet that outlines the task and requests some type of active reflection at key points) reminded me of the chemistry lab workbooks I remember from college. The workbook would structure what I was to do and, as I made my way through the task, would periodically request that I make an observation or comment on something I had just done.

A similar scaffolded approach is advocated by authors Highfill, Hilton, and Landis in their book "The HyperDoc Handbook" (2016). These authors propose that educators use technology to guide students through the discovery process, and they offer some lesson templates that can be found online. Some of these templates involve step-by-step guidance and require embedded learner actions.

It would not be appropriate to leave you with the impression that a process worksheet with embedded checkpoints is the only way to implement guidance. It does, however, seem a useful way to communicate the ideas of structural support and frequent checks on the understanding of purpose and outcome. Note that just "filling in" such a structure would not fulfill the expectation of feedback. Certainly an active request for comment may encourage the self-monitoring involved in metacognition, but external intervention may also be necessary when such externalizations of understanding indicate a student has missed the point.

 

Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1-18.

de Vries, B., et al. (2008). Supporting reflective web searching in elementary schools. Computers in Human Behavior, 24(3), 649-665.

Highfill, L., Hilton, K., & Landis, S. (2016). The HyperDoc handbook: Digital lesson design using Google apps. EdTechTeam Press.

Lazonder, A. W., & Harmsen, R. (2016). Meta-analysis of inquiry-based learning: Effects of guidance. Review of Educational Research, 86(3), 681-718.
