
Mike Evans knew something had to change.
As the lead instructor for American Government 1101 at Georgia State University in 2021, Evans had watched his students over the years show up with fewer facts and more conspiracy theories. Gone were the days when students arrived on campus with dim memories of high school civics. Now they came armed with bold, often misleading beliefs shaped by hours spent each day on TikTok, YouTube and Instagram.
One example of misinformation making the rounds back then was an anonymously posted video that more than half of teens in a national survey said provided “strong evidence” of U.S. voter fraud. The video was actually shot in Russia – crucial context that could be gleaned by entering a few choice keywords into a search engine.
Ignoring the problem of online gullibility felt irresponsible – even negligent. How could the course deliver on its aim of helping students become “effective and responsible participants in American democracy” if it turned a blind eye to digital misinformation? At the same time, a major overhaul of a course that enrolls more than 4,000 students each year – with 15 instructors teaching 42 sections in person, online and in a hybrid format – would create a logistical nightmare.
That’s when Evans, a political scientist, came across the Civic Online Reasoning curriculum, developed by the research group I used to lead at Stanford University. The curriculum, which is freely available to anyone, teaches a set of strategies based on how professional fact-checkers evaluate online information.
In fall 2021, he reached out with a question: Could aspects of the curriculum be incorporated into American Government 1101 without turning the whole course on its head?
My team and I thought so.
Teaching informed citizenship
Evans’ challenge was hardly unique to his campus.
For Generation Z, born between 1997 and 2012, social media – especially YouTube, TikTok, Instagram and Snapchat – has become the primary source of information about the world, eclipsing traditional news outlets. In a survey of more than 1,000 young people ages 13 to 18, 8 in 10 said they encounter conspiracy theories in their social media feeds each week, yet only 39% reported receiving any instruction in evaluating the claims they saw there.
We built our Civic Online Reasoning program to address this gap.
When we launched the program in 2018, digital literacy was a catchall that included everything from editing and uploading videos to cyberbullying and sexting. “Checking the credibility of sources” was just one criterion among many buried in a list of desired outcomes.
We narrowed the focus of our program to skills essential to being an informed citizen, such as “lateral reading” – that is, using the full context of the internet to judge the quality of a claim, identify the people or organizations behind it and assess their credibility. Rather than fixate solely on the message, we taught students to vet the messenger: What organizations stand behind the claim? Does the source of the claim have a conflict of interest? What are the source’s credentials or expertise?
We tested our approach in an experiment in 12th grade government classrooms in the Lincoln, Nebraska, public schools.
Across six hours of instruction – two hours less than the average teen spends online each day – students nearly doubled their ability to locate quality information, compared with a control group. We thought it wouldn’t be a huge leap to extend our approach to college classrooms.
In a version of this program modified for Evans’ course, we designed six short modules that could be used asynchronously, meaning that students could complete them on their own time, regardless of course format. Unlike information literacy lessons that soar above the particulars of any one discipline, our modules were closely tied to course content.
In a unit on the executive branch, for instance, students examined an Instagram video that falsely claimed President Joe Biden wanted Americans to pay more at the gas pump. In a module on the judiciary, they watched a video on TikTok about Ketanji Brown Jackson’s Supreme Court confirmation, posted by a partisan, left-leaning organization.
We created videos that pulled back the curtain by deconstructing tactics common in political campaigns – quotes ripped from context, videos spliced and selectively edited, and corporate-funded websites that masquerade as grassroots efforts.
We also taught students how to check facts like the pros. The main strategy was lateral reading – searching across the internet to see what other, more credible sources say about an organization or influencer. We challenged common assumptions too, such as the notion that Wikipedia is always unreliable. Not true, especially for “protected pages,” indicated by a padlock icon at the top of an article, which prevent editorial changes except those made by established Wikipedians. Another is the belief that a dot-org website has passed rigorous tests that qualify it as a charity, which has never been true: dot-org has always been an “open” domain that anyone can register, no questions asked.
These lessons took just 150 minutes in total over the semester, and instructors didn’t need to change a thing; they just listed the lessons on the course schedule.
Positive outcomes, modest effort
Did this approach work for Evans and his American Government 1101 students?
Across two semesters in one academic year, 3,488 students took a test at the beginning of the course and again at the end. One item, for example, asked students to evaluate a website that claimed it “does not represent any industry or political group” but is actually backed by fossil fuel interests.
In June, Evans, two co-authors and I uploaded a preprint of a journal article – not yet peer reviewed – documenting the experiment and its results. We found that from the beginning to the end of the semester, students got markedly better at identifying shady sources and more confident in evaluating where information comes from: their scores on these evaluations improved by 18%. Even better, 80% said they “learned important things” from the modules.
Not bad for an easily adopted addition to the course.
These results add to other studies we’ve conducted, such as one in a college nutrition class and another in an introductory rhetoric and writing course, that similarly showed how educators can improve students’ digital literacy – and their awareness of misinformation – without causing a major disruption to the curriculum.
And I believe it’s needed. A chasm separates the approved content that appears on students’ reading lists and the massive amount of unregulated, unverified and unreliable content they consume online.
The good news? This intervention could work in any subject where misinformation runs wild: history, nutrition, economics, biology and politics. Findings similar to ours from other college campuses buoy our confidence in the approach.
These changes don’t require waiting for a big revolution. Small steps can go a long way. And in a world flooded with misinformation, helping students learn to sort fact from fiction might be the most civic thing we can do.

Sam Wineburg received funding from the William & Flora Hewlett Foundation for this research. He is a board member of the not-for-profit Digital Inquiry Group (inquirygroup.org), which now operates the Civic Online Reasoning curriculum.
This article was originally published on The Conversation.