Discussing “incidental similarity,” information silos, and how comments on social media can drown out experts and empirical evidence.
Illustration by Matt Titone
Assistant marketing professor T.J. Weber was a Ph.D. student at Washington State University when he ran across a scrap of information that led to his present line of research. Instead of discovering it in class, however, or deep in a catalogue of assigned reading, he was simply on his way to class, cocooned in a pair of headphones.
“It was expensive to buy a parking pass on campus,” he says, “so I rode the bus every day and listened to the news. I heard a piece on NPR about Science magazine, discussing how the editors had decided to remove the comments from their online articles. Basically, they felt certain arguments, from certain people, were facilitating skepticism toward the actual science.”
Seven years and several of his own published papers later, Dr. Weber and his co-authors have looked deeply into the persuasiveness of online commentary and how to deliver convincing information amid the noise of our current digital landscape. It’s a body of work that in part unpacks the human impulses behind current hot-button topics, like the damaging effect of information silos, and the divisiveness of online discourse in modern society.
In test cases, which cover arenas like the vaccination versus anti-vaccination debate and global warming denialism, Weber and his team found audiences were often more likely to believe online commenters than professionals or credible authorities, such as the Centers for Disease Control and Prevention.
Thankfully (for those interested in empirical reason), the research also showed that an audience, even one with strong preconceived notions, can be persuaded to trust fact-based sources and experts, rather than their online counterparts. The catch is this only works if data and scientific explanations are presented in a specific way, in a specific order. We caught up with Weber for a chat to unpack this concept.
It seems like your work basically started with an interesting news story, which you ran across and took deeper.
Yeah. After hearing about the comments with regard to Science magazine, I thought this would be a really interesting thing to test—to see if people posting their own opinions or information alongside other information really has an impact on an audience. So we mocked up these public service announcements about vaccinations, added comments with gender-neutral names, and made the message as simple and sterile as possible. And what we ended up finding was that, even on posts from the CDC, commenters on Facebook were about twice as persuasive as medical experts.
How do you account for this effect?
There’s a concept called incidental similarity. The idea is that if you’re on a website and you see someone comment on content that interests you, and they seem to share your opinions, you infer that you can trust them because they’re like you, more than you would any other stranger or a government agency. This happens because we all have implicit egoism—the idea that we like ourselves, that our self-esteem is positive, even among people who are inwardly quite negative. If we perceive that we’ve come across people who are similar to us, we automatically like them. That’s why we tend to trust people online, even unidentified ones, more than sponsored sources: we infer they’re more similar to us.
So in an age when information silos are constantly discussed in the media, does this phenomenon explain why we continue to become a more polarized population? People seek out others who reinforce their preconceived notions, and that perpetuates ideas, or concepts, or disinformation?
That’s one of the dangers. We’re conditioned, by both the current media ecosystem and the way we interact with each other, to think there are always two sides to an issue. I mean, there are usually two sides to any large issue, right? There’s pro-life versus pro-choice, pro-tax versus anti-tax, private health care versus single-payer. What we found, however, is that many arguments are much more complex than that. And if people don’t have a firm grasp on all the nuances of a position, they can be much more swayed by comments from individuals who they feel have similar viewpoints. That type of value judgement can lead them to simply side with their constituency group rather than with the evidence.
In our digital world, is there a way to use empirical information to get an important message across, or is it all downhill from here? Will we always be swayed by social influence, or a nearly unconscious set of values we place on a source of information?
It’s a subtle distinction, but as part of one of our experiments we asked people to explain the process of the greenhouse effect. We showed them two videos. The first group saw a film that focused only on the consequences of global warming, while the second group saw a video that very simply explained the underlying mechanisms behind it. Then we asked both groups to explain global warming back to us. One of the things you’ll hear among faculty, or anyone who teaches, is that if you really want to learn something at a deep level, you teach it, because when you explain something out loud, you’re forced to really understand it. My co-researchers and I found that simply asking someone to explain a topic was crucial, but the group that learned the underlying mechanism had a better understanding of the topic and was more likely to trust that information, internalize it, and believe it. If people understand a concept at a basic level, in this case in terms of cause and effect rather than simply in terms of outcomes, it has an impact.
Pulling back to what your research ultimately shows, then: it suggests there are ways to present information so it has the best chance of being convincing to its audience and resonating. Obviously, for a marketing professor, that’s a useful insight.
Marketing is probably the most powerful tool we have in society. It can be used to help people make better decisions. That’s the core of what I think. My research is just applying that mentality to the biggest problems.