Fake News Fitness:
Against Disinformation

Misinfo of the Month:
Growth of Antisemitism on Twitter

The images at right are an evolving meme. Here's where it started, and where it got to for today's purposes, via a TinEye search. A more recent "Pegasus" version (less piquant, more fun) is here.

Like many other technoscenti, I deactivated my Twitter account on the day Elon Musk purchased the platform and, using an open source tool, migrated my followed list over to Mastodon (NYTimes), a federated network of open source social networking servers.

The main difference between Mastodon and Twitter is that there is no "algorithm" being tweaked by a company trying to maximize engagement for profit. You see what those you follow write, in the order they write it. It is what most people imagine social media to be, at least before they watch The Social Dilemma.

Of course, the motivations behind social media algorithms, like those of all corporate media, extend beyond mere engagement for profit. With Musk's takeover, the skew towards right-wing hate speech, particularly racism and antisemitism, increased immediately (NYTimes), as his signalling predicted.

The difference between the sensemaking process (the first five of the panels at right) and misinformation (the last panel) is whether personal bias and intentional algorithmic boosting find and reinforce unsupported patterns in data. Beware, tweeters.

Last Month's Article:
The 2,000 Mules Movie

When I showed the trailer at left to students who were already thinking about misinformation and asked whether they thought there "had to be something there", most hands went up. I could talk them through the fallacies, but on their own, what would they have made of this? Research shows they would be far more skeptical after Stanford Civic Online Reasoning's Evaluating Videos Lesson and Assessment.

Can We Teach the TikTok Generation to Think Straight about Media Messages?

Misinformation in the US drives inadequate responses to climate change, interferes with adoption of vaccinations and masks, and organizes astroturf rebellions that threaten democracy. Yellow journalism is not new, but disinformation from all sides at all times is much more pervasive and harmful now, and it is killing us. The primary source of disinformation in the US is political: the creation of "alternative facts" that are then promoted by social and broadcast media affiliated (formally or informally) with political parties (Kahne and Bowyer, 2017).

The first barrier to disinformation detection is tribal alliance. By the time we reach adulthood, we are likely affiliated with political parties that predispose us to trust some positions or sources and mistrust others, bypassing critical thinking. This is understandable, as it feeds the human need to affiliate with power groups, but polarization has dramatically increased in recent years (first from the backlash to Barack Obama's presidency, next from Donald Trump's adept use of social media as a bully pulpit). To fight this tendency to automatically align beliefs with current party positions ("blue no matter who"), we need to provide students with media literacy education before they adopt political affiliations.

There is a second barrier to fact-checking as well: bypassing critical thinking suits the brain's preference to avoid working harder than it thinks it needs to. As we scroll on social media, we like, comment on and share what is unexpected (often, disinformation) much more readily than what we expect. This "System 1" (Kahneman) impulse to rubberneck on social feeds drives social media to boost misinformation in a vicious cycle. How can we instill personal values that motivate that extra effort, grounded in the individual and societal impacts of disinformation?

Helping students resist the urge to accept polarized media messages at face value requires developing knowledge, skills and attitudes of "System 2" critical thinking, and also a values adoption / behavior change component: instilling a commitment to remember and practice. Students may need a "scared straight" introduction to the magnitude and pervasiveness of disinformation in our polarized society and its debilitating effects, which are easy to find in the news every day.

This approach is similar to that taken in drivers education courses: showing gory accident photos to instill the intention not to text and drive. It includes inoculation theory (expose students to disinformation's many forms so they can recognize it) but goes beyond it: asking teachers to use current events to speak directly, with passion, about what is happening to our world because people don't know how to recognize or value reliable sources. We know that recent exposure to fact checks blocks the spread of disinformation. We just need to instill values and practices in high schoolers to seek out or perform them.

The Task

Combating misinformation is an urgent, worthy goal, but there are rarely set-aside places in K-12 curricula for it. Schoolteachers who recognize the threat of misinformation need to find their own ways to help students learn to engage critical thinking before they share what interests them on social media, and to commit to those mental efforts for everyone's well-being. When teachers include misinformation detection in a middle school elective (aka "special", like music or art), or as part of a unit or project, they must make the most of a narrow window.

What curricula, materials and approaches should teachers draw from to convey the skills, knowledge and attitudes needed to combat misinformation? How should these choices vary across grade levels, and which will be the best fits for a given school and classroom environment?

Our Solution

Fake News Fitness, a Google Chrome Extension, is an instructional aid to help combat misinformation. It is not meant as a go-to tool for daily use, but as an initial training experience for students building mental maps of the web ecosystem of news media, advocacy organizations, government agencies and bloggers of all kinds. Building such maps is one component of media literacy, but there are many other components just as important, or even more so.

There are many curricula and materials that address media literacy, with components that concern misinformation detection. Teachers who do not specialize in media literacy need help selecting from these to create units and practices that are a good fit for their classrooms’ grade level, curricular context, and culture. Recognizing the need for such guidance, the Fake News Fitness developers offer a framework for creating a la carte media literacy instruction, drawing on experiential learning, social learning, and skills training.

Our intention is to position the Fake News Fitness Extension as a helper app used in media literacy education. From our own research and practice, we offer a framework for combining approaches and models to combat the spread of misinformation the way drivers education combats accidents. We also identify particular products that exemplify the best of those approaches.

Quick Links

The Fake News Fitness Extension

Trust Me (Belic / Getting Better Foundation)

The Social Dilemma (Orlowski / Netflix)

Checkology (News Literacy Project)

Digital Citizenship Curriculum K-12

Stanford Civic Online Reasoning (COR)