How social media distorts our perceptions
Our information diet is shaped by a tiny sliver of humanity whose job, identity, or obsession is to post constantly
The experience of interacting with others on social media is growing increasingly distant from reality. Jay recently wrote an article for the Guardian exploring this disconnect. He explains how social media creates a deeply distorted picture of society, amplifying outrage and division far beyond what most people experience in their daily lives.
The gap between the inflammatory tone of online discourse and the relative calm of everyday life has grown more pronounced over time. A recent article from our lab by Claire Robertson, Kareena del Rosario and Jay explains how social media functions less like a neutral reflection of public opinion and more like a funhouse mirror—creating a distorted reflection of reality.
A small, hyperactive group of users—just 10%—produces about 97% of political content on platforms like X (formerly Twitter). Even more striking, only 0.1% of users are responsible for 80% of fake news. This small group of "super-users" skews public perception by flooding social media with extreme, emotional, and often misleading content.
While these extreme voices dominate the conversation, people with ambivalent, neutral, or more complex views tend to stay quiet. This renders moderate opinions effectively invisible (see Fig 1 below). Instead, people scroll through a newsfeed overloaded with the extremes.
These users shape how others perceive social norms, because humans naturally rely on cues from others to understand what’s typical or acceptable. On social media, these cues are distorted by extreme voices. Algorithms designed to maximize engagement tend to prioritize divisive or surprising posts, further boosting the most inflammatory users. As a result, users are exposed to a narrow, sensationalized slice of opinion that makes society seem more polarized and hostile than it actually is.
This creates a vicious cycle: seeing extreme views leads people to believe those views are widespread, which can shift their own expressions and beliefs. Even moderate users may exaggerate or perform outrage to gain attention and social status online. Over time, this fosters pluralistic ignorance, where individuals misjudge what others actually think or feel. The overperception of moral outrage in online social networks then fuels beliefs about intergroup hostility.
However, this cycle is not inevitable. In another study from our lab, led by Steve Rathje, participants were paid to unfollow the most divisive political accounts. Within a month, they reported 23% less animosity toward opposing political groups. Nearly half chose not to re-follow those accounts even after the study ended. The effects lasted nearly a year, showing that small changes to online habits can meaningfully improve our outlook.
Users can take back control by curating their feeds, resisting outrage-driven content, and refusing to amplify toxic voices. Platforms could redesign algorithms to prioritize more balanced and representative content. In short, if we recognize that social media is distorting reality, we can begin to reshape it into a healthier, more truthful space.
News and Announcements
We are also pleased to announce that Steve Rathje and Jay wrote a review about the psychology of virality that was recently published in Trends in Cognitive Sciences.
Speaking of going viral, Steve recently made an appearance on the CBS morning news to share his expertise on how to navigate misinformation on social media. Check out and share the segment, and congrats to Steve!
The Center for Conflict and Cooperation is preparing a new study (led by postdoc Rémi Thériault) to examine how non-fiction books influence affective polarization—that is, the growing dislike and distrust of members of the opposing political party. We are currently collecting book suggestions to test through AI simulations and we would be grateful to receive suggestions from our readers as well. Please suggest your top 3 non-fiction books to reduce affective polarization through the following link: https://nyu.qualtrics.com/jfe/form/SV_erMYBfkgGqwsdVA
Diego Reinero, a lab alumnus, recently co-authored a paper in PNAS that tackles how we can motivate climate action. The research team tested 17 psychological interventions head-to-head in a “tournament” and found that certain interventions, like writing a letter to a future generation, were more effective than commonly used strategies like sharing carbon footprint information. Awesome work Diego!
Another lab alumna, Sarah Grevy Gotfredsen, was recently admitted to the PhD program at Copenhagen Business School. We are so excited for Sarah, congratulations!
In other good news, a research assistant at our lab, Xander Flores, is interning at Carnegie Mellon University's Human Computer Interaction Institute this summer, where he is building tools that give users more control over the information that AIs collect. He also won a grant to conduct a project on moral tipping points in social networks. Congrats Xander!
Finally, this news is coming to you from the Center’s new lab manager, Hannah Karsting. I am so excited to be here, and I hope you are excited to hear more from me.
In case you missed it, last week we had an interview about “Misguided: Where misinformation starts, how it spreads, and what you can do about it” with Matthew Facciani. Click below to read the interview and enter our book draw.
This column was drafted by Sarah Mughal, with additions from Hannah Karsting and edits from Jay Van Bavel.