Improving assessment of web content - a research-based approach

We report few research studies in detail, but sometimes the methodology of an applied study offers a concrete example that may be adapted to a classroom setting. Wiley and colleagues (Wiley, Goldman, Graesser, Sanchez, Ash & Hemmerich, 2009) were interested in how college students would use search results to complete a writing assignment. Specifically, they asked students to "Write an explanation of what caused the eruption of Mt. St. Helens" after reviewing a number of web sites that supposedly resulted from a search on the phrase "causes volcanic eruptions". As is sometimes the case in educational or psychological research, the list of web links provided to research participants was not the top "hits" from an actual search, but a collection of informative and pseudoscience sites offering information on volcanic eruptions. The idea was to see how the research participants would respond to resources of varying "quality". For example, an astrology site was included proposing that eruptions resulted from the alignment of stars and planets. The researchers proposed that the resources provided represented important conditions of authentic information literacy tasks: a) a complete explanation requires the combination of information from multiple sources, and b) not all sources returned from a search provide valid information. Performance on this task provided the dependent variables (the outcome data) for the experiment, and the exercise was completed some days after alternative forms of a training exercise.

During the training phase of this experiment, these students were presented with a task that asked them to determine which of several sites would be most useful for evaluating "whether the Atkins low carbohydrate diet is a healthy or harmful diet". In this case, the sites supposedly resulted from a search for "low carbohydrate diets", but were again selected to represent a range of quality. Here is how the "treatment" group differed from the "control" group. Without guidance, the control participants explored the sites, rank ordered them from most to least reliable, and offered justification. In contrast, the treatment group was given guidance in how to evaluate the sites. They were told to see if they could determine who authored the content of each site, to determine if the author was likely to be knowledgeable about the topic, and to consider what the author's motivation might have been for authoring the site. They were asked to determine if the evidence provided was scientific and if it seemed to be consistent across sites. They were asked if what they encountered on each site seemed to fit what they already knew. Like the control participants, those in the treatment group then ranked the quality of the sites with justification. They then received a ranking of the same sites supposedly generated by experts and were asked to speculate about discrepancies between their ranking and the ranking of the experts.

The treatment and control groups were compared on how they examined the sites associated with the Mt. St. Helens task, their rankings of the reliability of the sites, and on the information quality of the essay they wrote to explain the eruption. The treatment group was more consistent in its ranking of site quality. This may not be that surprising because those participants were likely applying similar criteria. Time spent on reliable and unreliable sites did not differ across treatments. The treatment group wrote essays including more core causes and fewer incorrect causes.

The research study we have just described was conducted with college students. Perhaps we can offer you a taste of the experience. The links below were included:

  1. Efficacy and Safety of Low Carbohydrate Diets
  2. Count calories, not carbs
  3. Atkins Diet Alert
  4. The Atkins Weight Loss Program

[Note - 1, 2, 3 & 5 (4 & 6 seem no longer to be available) - this could easily be set up as a task by reordering the list, asking students to provide their own ranking, and then offering a way to check the ranking used in the study]

Do you think there are really correct answers for a task such as this? We would classify such tasks as exercises in critical thinking and propose that the quality of critical thinking is evident in the quality of the process. In other words, it is the quality of the argument offered in support of the rankings that demonstrates critical thinking. One can determine whether sound criteria have been applied as justification. Depending on the task, it may be very difficult to determine a correct answer or identify the truth. However, the goal is to prepare students to think carefully about the resources they explore, and such careful thinking would be evident in the rationale given for the ranking of the sites investigated.

We include this brief description of a complex research project for several reasons:

  1. First, while the treatment in this research project consists of evaluation suggestions that are very similar to those made by many others, the demonstration that college students who practiced these specific suggestions could transfer the application of the strategies to a new task, and that the application of the strategies would improve performance on a realistic web inquiry project, is impressive. There is plenty of practical advice on how to improve student use of web resources, but there is very little research demonstrating that the application of these strategies makes a difference.
  2. Second, the methodology used here seems to offer a very practical model for a classroom activity.

We challenge you to create a similar exercise for a content area you intend to teach. Consider how you might go about doing this. What would make a good topic? Consider some topics that have recently received a great deal of media attention, for which there are strongly held and perhaps self-serving views, and for which some advocates seem willing to take extreme positions that might be interpreted as bending the truth a bit to be persuasive. Topics like global warming and health care reform come to mind as we propose this exercise, but different topics may receive the same public attention in a few months or years. Could you collect a small number of web sites focused on the utility of a specific policy proposal such as cap and trade? What would make good web sites to mimic the task the researchers used? We would attempt to locate sites that accurately described the strengths of different positions, but also sites that were self-serving or simply inaccurate. What specific questions would you have students answer in evaluating the sites? How would the exercise be structured? Perhaps a small team approach in which the team evaluates each site using the scaffolding questions and then creates a ranking of the sites with justification. Teams might then compare and perhaps debate their rankings in a forum involving the entire class. How much time do you think an exercise of this type would take from start to finish?

Try a short list on the topic of "environmental potential of clean coal":

  1. This is reality
  2. Coal is clean
  3. Clean

We encourage you to explore Susan Beck's pdf titled "The good, the bad, and the ugly: Why it is a good idea to evaluate web resources". You will see that it takes an approach similar to the research method we described and to the information literacy task we propose you develop.

We encourage you to share your task, links, and an evaluation rubric arguing for a specific ranking of web sites.
