Building skills in evaluating sources is fundamental in today's information-rich world. With the proliferation of digital platforms and the ease of sharing information, the ability to discern trustworthy sources from unreliable ones is crucial. Educating individuals on how to assess sources for credibility, bias, and accuracy empowers them to form well-founded theses, make informed decisions, and contribute meaningfully to academic discourse. These skills will not only aid students throughout their academic careers but also translate into practical everyday life, enabling individuals to confidently and consciously navigate the complexities of an information-based society.
You can access the lesson plan through the tabs above. It takes about 45 minutes on average to conduct the lesson. Preparation includes reviewing the lesson plan and gathering materials such as printouts. If you would like any assistance in preparing for your library instruction, please feel free to schedule a consultation with John, Connor, or Kathleen by following the link in the column to the right.
A printable version of this lesson plan is available here.
Lesson Name & Topic:
This lesson plan is designed to help students develop the ability to distinguish accurate information from biased or erroneous content.
Students will further develop critical thinking skills by evaluating information sources utilizing methods such as lateral searching.
45-minute class session
[General timing: Slides 1-8, about 10 minutes, including 3 minutes for either the video (slide 3) or the class discussion (slide 4)]
Introduce yourself and quickly go over what you’ll be covering in class today.
Information is created by people, and it comes with a certain point of view. Understanding these different viewpoints is critical to understanding and using the information we receive. We all evaluate bias and credibility all the time. When we choose which news sites to read or which Tweets to believe, we are thinking about the credibility of the source. When we ask for recommendations on which movies to see or which books to read, we select friends or colleagues whose opinions we trust. So to start, let’s talk about what credibility means to you.
We take a number of factors into consideration when judging the reliability of information. A source doesn’t have to be very reliable in order to be useful for your research ... as long as you realize that it’s unreliable, or that it’s aimed at a particular audience, you can employ that source in a manner appropriate for your purposes. For example, an individual post on the r/Coronavirus subreddit might contain unsubstantiated claims and yet still constitute an interesting bit of evidence about public misperceptions around the pandemic.
For many factors we commonly consider (like, the author is an “expert” or the source is “unbiased”), you need to step away from the source itself in order to really determine the answer. Research shows that the best and most accurate determinations about a source’s reliability come from an activity called lateral searching (or lateral reading), where you look for outside information about the source.
One way to help remember some of the techniques of lateral searching is through the mnemonic SIFT. [Refer to the SIFT handouts in their group’s packet so they can read along.]
Often, your first “sifting” of information sources is just to put them into three piles (definitely not worth my attention, could be but with some caveats, generally credible) in order to determine which ones to spend more time with.
Some quick ways to laterally search for information about sources you encounter, on the Internet or even in the Libraries’ research databases:
[General timing: Slides 7-12, about 10 minutes, including 8 minutes for activity -- you’ll move through slides 9-13 very quickly]
Let me give an example.
Imagine that a close friend or family member is considering becoming a vegetarian. You don’t know much about the environmental, health, or ethical aspects of vegetarianism and are doing research to help your loved one make a carefully considered decision.
I came across this quick article on the omnivore diet in my first Google searches. It comes from a website called NutritionFacts.org. Great -- I know the name of the organization claiming responsibility for the content. This particular “Topic summary” is undated and credited only to a volunteer with only a first name, “Randy.” The summary says that “many clinical trials” have compared omnivore diets to plant-based ones, but all the links in this paragraph go to videos from NutritionFacts.org’s own video library.
Don’t spend too much more time diving into this source! I’m not going to look at the “about” page or any of the other content on the site yet -- first, I’m going to search laterally to get more information about this source.
A Google search for nutritionfacts.org turns up a lot of content from nutritionfacts.org.
Try to eliminate those self-referential results by adding the -site: operator to exclude the site’s own domain from the search.
Most of the results that come up still come from NutritionFacts.org, this time via its social media accounts.
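[Optional, for instructors who want to show the syntax on the board: the lateral-search trick here is Google’s -site: exclusion operator. The exact queries on the slides may differ; as an illustration, queries like these filter out a site’s own pages:]

```
nutritionfacts.org -site:nutritionfacts.org
"Michael Greger" -site:nutritionfacts.org
```

[The -site: operator removes any result hosted on the named domain, so what remains is what other people say about the source rather than what the source says about itself.]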
Hmmmm.... Some of those results reference a “Dr. Michael Greger”
Google his name ... a lot of the results are still controlled by him.
But the info box to the right of the screen links to a Wikipedia article on him.
Scan the table of contents for the Wikipedia article. There’s a section on “Reception” of his ideas. It notes that two critics acknowledge his credentials but say he overstates evidence, has a “vegan agenda,” and cherry-picks his data.
While NutritionFacts.org doesn’t appear to be peddling misinformation, I would need to take the site’s clear bias into account before taking any information it provides at face value. This might be an excellent primary source example if I’m doing research about vegan vs omnivore agendas, but if I’m looking for something more even-handed I should probably discard this source and keep looking.
These first two parts of SIFT (Stop, Investigate) are often enough for a first, cursory evaluation of information sources.
Distribute the group packets (if it hasn’t already been done) and briefly explain the activity.
While the groups discuss the articles, draw a large (2-3 feet) version of the reliability scale on the whiteboard. Then, circulate around the room to see if there are questions and distribute the Corgi assessment sheets.
[General timing for slides 13-21: 10 minutes. During the group activity, you might want to hide the slides for sources your groups won’t cover.]
[GROUP 1. For each of the 6 (or 7) example slides, ask students who were NOT part of the small group that examined this source for an impressionistic ranking of the source’s reliability. Then, ask the group who had that source to give their initial reliability ranking, and at least one reason for it.]
[For this example, a Google search for Center for Consumer Freedom brings up a Wikipedia page with a “Criticism” section; also the name of the organization’s founder]
[GROUP 2. This article is from a very respectable peer-reviewed academic journal with an impact factor of 12.78 that has been published since 1915 -- so, the article passed through peer review. However, students might dig deeper and realize that 1) a response to this article with some objections has been published in PNAS; and 2) the study has garnered criticism for its assumptions and methodology, although some of the critics have clear agendas of their own.]
[GROUP 3. Obviously, an article from Wikipedia might be difficult to rank with the “just add wikipedia” trick ... but perhaps they will think to google something like “reliability wikipedia -site:wikipedia.org”. There’s also a mini editing war, indicated by the revisions and reversions on the article’s View History tab, and a heated discussion on the Talk tab about the removal of a section calling fruitarianism a “fad diet” and about how even-handed the article is, or isn’t -- but those tabs don’t always show up in the mobile version of Wikipedia.]
[GROUP 4. The New York Times is one of the most reputable newspapers in the US and is generally considered our “newspaper of record.” However, this article is from the opinion section, and the author, Jonathan Safran Foer, can be assessed for his general credibility and expertise.]
[GROUP 5. Self-published book -- Google for the author’s name, and also for the publisher, “MedInform Publishing” ... not a lot out there, and some references under the author’s name to a “...”]
[GROUP 6. MMWR is a well-established publication. The CDC’s reputation has taken a hit lately, and students might also point out that this particular article is 20 years old. Is “currency” the same thing as “reliability,” though?]
Slide 19: [Hidden]
[GROUP 7. This extra slide for a 7th group is hidden in presentation mode unless you unhide it.]
[The HappyCow blog has a brief wikipedia article. This article’s references aren’t very solid, and the blogger’s credentials in the area of agricultural science are ... not very solid either]
This has been a great discussion! I’ll share the link to the slides with your instructor, so you can follow these links for more information about practicing SIFT in your research.
LINK TO EVALUATING SOURCES Guide, and maybe info about types of sources (Pop vs scholarly, Primary vs secondary) as well
And remember, the UW Libraries are here to help you! Academic research can be intimidating, but you can make an appointment for one-on-one help with the Odegaard Writing and Research Center.
[Please collect the packets of articles before you leave ... we need to reuse them to save paper! Tell students they can keep the SIFT handouts, though.]
Consult with a librarian about assignment design, ideas for information literacy activities, Canvas materials, and other ways to teach your students information research skills.