
Responding to “Fake News” in a Post-Truth Era

Accusations that have no basis in reality can be surprisingly damaging. But there are some ways to weaken them.

There is little debate that we are entering a new era in crisis communications. The proliferation of algorithmically driven social media platforms allows erroneous claims and “fake news” reports to propagate with unprecedented speed. The situation is made all the more worrying by Donald Trump’s White House, which not only lends credence to questionable information to further its narrative but is, in many cases, an instigator of fake news itself.

Not long after Trump became president, his counsellor Kellyanne Conway introduced the phrase “alternative facts” when defending White House Press Secretary Sean Spicer’s inflated claims about attendance at Trump’s inauguration.

Trump’s recent assertion that Sweden was having “problems like they never thought possible” because “they took in large numbers” of refugees went viral and was widely disputed. What makes the spread of such information particularly dangerous is that, according to research, misinformation takes hold more rapidly and more easily in populations that perceive themselves to be in an insecure position. Trump’s base, accordingly, rallied to his support.

Given that fake news instigators thrive on stoking their followers’ confirmation bias, the tendency to favour information that confirms existing beliefs, they present serious challenges to the reputations of those they attack. In his book, On Rumours: How Falsehoods Spread, Why We Believe Them, and What Can Be Done, Cass Sunstein, a professor at Harvard and formerly President Barack Obama’s administrator of information and regulatory affairs, suggests that correcting or countering false rumours, lies, or “alternative facts” is very difficult, and should be a matter of public concern. In many cases, therefore, attenuating them may be the only hope.

Correcting fake news

These alternative facts can also be durable. In a paper, “The Continued Influence of Misinformation in Memory”, Colleen Seifert at the University of Michigan noted that there is a “continued influence effect”, where misinformation continues to influence judgments even if that information has already been corrected by the accused.

This is one reason why the conventional communications tactic of responding to fake news or false claims with condemnation and retort has so far proven inadequate in the post-truth era. Responding with outrage has also fallen short: it makes the accused appear to be “crying wolf” and leaves them easily painted as hypocritical or over-reactive, as Hillary Clinton learned in the presidential debates. Successful correction, Seifert goes on to state, “appears to require assisting the reader in resolving this contradiction.”

Ways to respond

These powerful “barrage” tactics therefore require a new kind of response, suggestions for which I have listed below. These are not exhaustive, nor are they a process to follow, but considerations for responding when under attack. In many cases, however, responding can only go so far, so there are limits to the effectiveness of these counter-strategies. Your counter-information may never make it past the biases of the accuser’s hardened followers, no matter how clear the message or pure the intentions. Your aim should be to target those in the middle who are either undecided or interested in furthering rational debate.

1. Condemn and turn the argument on the accuser. While condemnation may be necessary, be careful not to repeat the instigator’s claims lest your outrage become fodder for their followers’ entertainment. Turn the argument around by making strong points or asking pointed questions to demonstrate that the emperor is not wearing any clothes.

Trump’s recent claims that something happened “last night in Sweden” elicited a very calm yet pointed response. The Swedish embassy in the U.S. tweeted in response, “we look forward to informing the U.S. administration about Swedish immigration and integration policies.” Even to those not following developments, it served as a stand-alone statement that might simply be construed as an act of friendly information sharing, staying well clear of the original claims. Even Trump, the accuser, was forced to admit the source of his claim, which turned out to be a widely debunked Fox News report.

2. Communicate your values. Research by INSEAD Professor of Organisational Behaviour Charles Galunic found that companies that differentiated their espoused values from those of their peers, and updated those values over time, outperformed the competition.

The response of U.S. CEOs to Trump’s first proposed immigration ban on citizens from seven majority-Muslim countries not only produced a memorable rebuttal that loomed larger than claims that immigrants were bad for America, but also drew a line in the sand when it came to values. This builds on the traditional crisis communications strategy of taking the moral high ground, especially if you have it. If you engage in the same sort of pettiness, you’ll end up with the dirt on you too.

3. Be funny. It may be tempting to fight fire with fire, but given the increasingly divisive nature of fake news, this often reinforces the accuser’s narrative, confirming stereotypes or negative beliefs that their followers already hold about you. Refusing to play the “enemy” role makes it more difficult for the instigator to pin you down and demonise you to stoke more support for their ideas.

One very effective way to respond is with humour. Research on humour shows that when people in a conflict situation are exposed to humour and given pause to laugh, convergent thinking (the tendency to believe in only one solution) gives way to divergent thinking, which unveils other possible outcomes for the conflict. This is also reflected in research on schools and offices showing humour to be an effective aid to creativity.

4. Or consider not responding at all. There is also such a thing as engaging too much. Depending on the ludicrousness of the claim, it could be better to wait it out. McDonald’s learned this lesson when a series of fake stories spread online claiming it was using worms as filler in its burgers. Eventually it stopped responding and let the story, which was subsequently shown to be false, run out of steam. It is important to pick your battles and set a cut-off point for making further counter-claims. Consider whether the claim is likely to die out in the news cycle or whether it is something you need to kill.

All aboard the fake train

Preparing for the day of a fake news attack will be challenging, as it will be difficult to anticipate exactly what form it will take and where it will come from. But a good start for companies is to form a holistic perspective on what their reputation looks like, especially to their biggest detractors. This will also reveal the kind of “alternative facts” already circulating among the audience, so they can start readying counter-arguments. A crucial part of such an exercise should be to unearth the salience of your messages. In essence, ask whether people listen to, or believe in, your narrative.

Used wisely, humour is often a good response to fake news attacks; this can involve seeding funny content to fans who will come to your aid in the event of an affront. It may not always be necessary to respond. In the event of a barrage of negative attacks, it is wise to pick your battles, stay outside the narrative framework of the accuser, and build memorable responses. Hitting back may be necessary, but not with a direct attack. This redirects a potential vicious cycle into a virtuous cycle for your reputation.

Ian Anderson is a PhD Candidate in Marketing at INSEAD.


Think Before You “Like” [or Follow]: Social Media Echo Chambers, Divergent Perspectives, The News, and Wisdom

In the past few days, the Baltimore protests have been the top news story in America. The media has covered the situation in relatively predictable fashion, each outlet telling a slightly different version of the same narrative, many focusing on the biggest selling points for ratings and virality, among them: property destruction and images of a burning CVS, arrests and clashes with police, a protester performing Michael Jackson’s ‘Beat It,’ and Toya Graham catching her son and slapping him for participating in one of the protests (or “riots,” depending on what channel you were watching).

Most of the biggest TV news outlets (CNN, MSNBC, FOX, etc.) have followed their usual trend of neglecting the more substantive questions, the ‘why’ behind the violence, in exchange for video footage and sensationalist coverage that they hope will go ‘viral’ online or get eyeballs on TV screens and make them more money. However, despite these common threads in their coverage, one can clearly see their narratives diverge and cater to their respective audiences. This trend is just as disturbing as their lack of substance, and one that I see as widespread across the news media spectrum, both on television and on social media.

As a person of strong left-leaning political views, my Facebook and Twitter feeds are full of blogs and news outlets like Jezebel, Gawker, Black Girl Dangerous, Mother Jones, Salon, Colorlines, The Huffington Post, and others. For a couple of years (since Facebook made itself into a good place for news gathering), I have been constantly reading and consuming this content. As a result of my desire to know what was happening in the world, my newsfeeds slowly filled with content from these pages, and content from my friends began to take a back seat, unless they were regularly posting content that I “liked” and commented on.

My work as a social media strategist made me take a more critical look at how this content was being delivered. I came to understand that Facebook’s newsfeed and Google’s search both work through algorithms that are, for better or worse, designed to show each of us the ‘most relevant’ information.

The words ‘most relevant’ mean, loosely, that their goal is to deliver content that they think users will enjoy, click on, or find helpful (which also helps sites like Facebook bring in advertising money and keep shareholders happy). The stated intent of this algorithm-generated ‘tunnel vision’ is to improve user experience and prevent us from seeing content that might annoy us or even cause us to leave the site in frustration (all social networks are deeply wary of polluting users’ feeds with bad content and going the way of MySpace).
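As a rough illustration of how this kind of engagement-weighted ranking can narrow a feed, here is a toy sketch. The real platform algorithms are proprietary and far more complex; every name, weight, and formula below is invented purely for illustration:

```python
# Toy sketch of engagement-weighted feed ranking (illustrative only;
# real platform algorithms are proprietary and far more complex).
# Sources the user has engaged with before get boosted, so over
# repeated rounds the feed converges on a narrow set of sources.

def rank_feed(posts, engagement_history):
    """Order posts by recency, multiplied by the user's past
    engagement with each post's source (a hypothetical scoring rule)."""
    def score(post):
        affinity = 1 + engagement_history.get(post["source"], 0)
        return post["recency"] * affinity
    return sorted(posts, key=score, reverse=True)

posts = [
    {"source": "outlet_a", "recency": 0.9},
    {"source": "outlet_b", "recency": 1.0},
    {"source": "friend_c", "recency": 0.8},
]

# The user has clicked outlet_a content five times, the others never.
history = {"outlet_a": 5}

feed = rank_feed(posts, history)
print([p["source"] for p in feed])  # -> ['outlet_a', 'outlet_b', 'friend_c']
```

Even though outlet_b’s post is fresher, outlet_a jumps to the top because of past clicks, and each new click widens the gap: the feedback loop behind the “echo chamber” effect described in the next paragraph.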

Whether benign or malicious, these efforts also end up pushing users into social media “echo chambers” — either through their friend groups or through the pages they follow — where users see the same ideas repeated over and over again. If we see something we don’t like, we can simply unfollow a page or person at the click of a button — even report or block someone for posts we find offensive.

I’m not of the mind that free speech is a value to be held on a pedestal above all others, nor am I in favor of sites having no content filters for hate speech or threats of violence. But there is something harmful about focusing users’ social interactions on people within their own camp who share similar opinions. This seems to be the case even with people who have diverse groups of ‘friends,’ because the algorithm knows what you “like” or click on. It is no secret, either, that diversity of perspective is important for intellectual and emotional growth: it is what helps us learn new things, gather different points of view to formulate our own perspectives, and make those perspectives more nuanced.

Since so many of us use social media and Google search as our sole sources of news and information, our perspectives are being filtered through somewhat narrow lenses. These divergent lenses, created by social media and other information transmission tools, are clearly contributing to deep divisions between people of differing opinions, making it easier to alienate and bully, and in many ways driving us further apart rather than bringing us closer together, as many social networks aim to do.

As part of a kind of experiment to test these theories, I spent around two months sharing a lot of articles from the aforementioned pages that I follow, including many taking strong sociopolitical stances (90% aligned with my own views) on current issues. The articles sparked discussion both productive and unproductive. I found that some of this content made me a target for the vitriol of people I considered friends or acquaintances; multiple people threatened via messenger to “unfriend” me for posting some of the things I did and for attempting to engage (or antagonize) people into political dialogue. Others likely quietly “unfollowed” my feed or blocked my posts without going to the extreme of removing me as a “friend.” I noticed high levels of interaction and sharing on some of the most controversial or intense posts, but the overwhelming majority of the interactions the posts received were positive and supportive.

I took this experience as evidence of my own personal social media “echo chamber.” I was seeing what I wanted to see, and thinking what I wanted to think, with minimal push-back. In an attempt to escape this, I tried to adjust the lens: I “liked” a few pages on the other end of the spectrum, including FOX News, Bill O’Reilly, and The Kelly File, to catch a glimpse of what the ‘other side’ looked like. As I began seeing their posts in my feed, I noted that their lens seems equally constraining; the divides expressed in the comments, and the narratives they present around different events, are incredibly different from those on my side, yet similar in their presentation and limited perspectives. Attempts to reach across these divides, as might be expected, particularly online, are met with unbridled rage, on both sides.

I write this not to suggest that anger or frustration is unjustified, or call for greater ‘civility’ in social interactions. Sometimes, a bit of anger — even violence — is necessary for healing and positive action, as the protesters in Baltimore (and in other cases worldwide) have shown us all. I write to call for awareness of a potentially harmful and deeply divisive trend — and suggest we all attempt to seek out different perspectives.

I say this because there is a danger in allowing ourselves to wear blinders. In order to combat this tunnel vision and make ourselves wiser, we must take them off, and actively look for new information in unexpected places. Consciously choosing to do so can foster deeper understanding, greater diversity of thought and opinion, and help us know our enemies (and allies). Even if these attempts mean we only despise them more deeply, we must at least do the emotional and intellectual work it takes to understand, humanize, and combat them. Otherwise, we may never be able to grow — or completely heal.