Here’s a simple activity you can try in your class: have students run a Google search phrased as a question. Then have them look at the top five results and, using lateral reading, pick the source that is most likely to be authoritative and the source they think is least authoritative. Have them talk through their reasoning.
For example, take this set of results for “Can magnets cure cancer?”:
We might note that the first result comes from a fringe site that sells “therapeutic” magnets, and is quoting from an out-of-print magazine from the 1980s called “Magnets in Your Future” (that’s the name of the magazine, not the article). A similar critique could be made of the MagnetiCare product site.
On the other hand, the two results on the bottom make a good pair. The scientific paper is cited by 25 papers, and shows that in some lab conditions magnetic therapy (somewhat different from magnets) slowed the progression of a specific cancer in mice. The Sloan Kettering article, however, says there is no evidence that supports the use of magnets to treat cancer. How do we weigh these against one another?
It’s strange, because you would think what is reliable here would be obvious, but if you hear students talk through their thought process I guarantee you will hear things that surprise you. It may be obvious to you, for example, that a natural remedies website is less reliable than a research hospital at summarizing treatment efficacy, but to some students these may just be two different sites pitching different things — one magnets, the other traditional care.
Likewise, a person with some health research literacy will know that a single study should not outweigh more general surveys of treatment efficacy, but many students will conceptualize this wrongly, not understanding that (in general) new studies serve to qualify old studies, not replace them.
You might also show the students how you think about choosing a search result yourself (assuming you’re doing it effectively!). Walk through what you do with “Does fracking cause earthquakes?”, and explain why you jump to the USGS link instead of the Forbes one, as well as why you might not see the USGS as the final word.
Need some questions to ask Google? I’ve written up over 300 for you to choose from, ranging from ones about obvious hoaxes to ones that require a deep dive into areas where even experts disagree.
They are here.
I find that pumping a wide variety of questions like this into Google helps me think about what students need when flipping through search results. Let me know if they help you do the same. Many were chosen from trending stories, and a lot were generated by playing around with Google auto-suggest. I’d love at some point to rate the questions for difficulty, but right now they are just alphabetical.