In the last few weeks I’ve seen what I can only call algorithm anxiety: the fear that the algorithm is fracturing society by bubbling us into echo chambers.
Well, I’m going to tell you why it’s not Facebook’s or Google’s fault that people are “bubbled”, and why changing their algorithms:
1) will make no difference to the outcome;
2) would be a terrible idea for free discourse.
I’ll also suggest 3) a strategy that might actually help better inform people.
So let’s jump right in and talk about…
The Bubble Effect
The bubble effect has been around almost as long as humanity itself. If you don’t believe me, read the Bible.
Look around you and see how many bubbles you can name. The suburbs. The alternative kids in high school. The yummy yoga moms. The council estate kids. Village life. Academia. Startup culture. Home-schoolers.
Next time you’re on a half-empty bus in a diverse city, really pay attention to how people fill the seats. Buses are amazing displays of people self-sorting into affinity clusters so that, if at all possible, they don’t even have to share space with someone too dissimilar to themselves.
The Bubble Effect and Facebook
Even with Facebook’s algorithm and the options to mute or unsubscribe from certain people you’re connected to, a lot of people I know found social media in the run-up to the US election to be an emotional minefield. Rather than start arguments with family members or coworkers, some avoided it entirely. A social network is the people who log into it. It’s an organism made up of people. To a social network, people avoiding logging in is death.
Facebook’s first priority is to keep the network alive and active. Since Facebook expanded beyond college-registered students, I’ve seen a lot of supposed “Facebook killers”, each aiming to fix a different perceived problem with the platform, but all of them have stagnated, stayed niche, or shut down. That’s the difference between what people say they want and how they actually behave.
Mind the Intention-Behaviour Gap
This is called the ‘intention-behaviour gap’. It’s why we say we’re going to get healthy, but then order that pizza, or say that more people ought to buy domestically made products, but keep buying the lower-cost, made-in-China stuff.
Facebook have survived by being aware of, and reactive to, that gap: continually testing different page layouts and feeds and seeing what actions people take next. Like all evolution, this may not produce a perfect organism, just the one best adapted to the current climate.
As it turns out, we are not so rational when it comes to news that doesn’t match our beliefs. In a 1995 study, researchers handed subjects fake stories that matched their feelings on the Iraq war. After confirming that the subjects believed the fake story, the researchers handed them a true story and told them the previous one was fake.
The subjects overwhelmingly felt the true article was “just an opinion”. Far from being persuaded, people become more convinced of their previous feelings or feel personally attacked when they are simply being presented with an alternative point of view.
Too Many Friends
“Facebook says that the typical user has about 1,500 stories that could show in the News Feed on every visit.” That’s 1,500 stories EVERY time you load Facebook.
This is why filtering is necessary. And given what we’ve seen above about the futility of persuasion, what would you have them do?
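To make that scale concrete, here’s a minimal sketch in Python of what ranking 1,500 candidate stories down to a screenful might look like. Every signal and weight here is invented for illustration; Facebook’s real ranking model is proprietary and far more complex.

```python
from dataclasses import dataclass

@dataclass
class Story:
    author: str
    closeness: float   # hypothetical signal: how often you interact with this author (0-1)
    engagement: float  # hypothetical signal: how much traction the post is getting (0-1)
    age_hours: float   # how old the post is

def score(story: Story) -> float:
    # Made-up weights: favour close friends and fresh, engaging posts.
    recency = 1.0 / (1.0 + story.age_hours)
    return 0.5 * story.closeness + 0.3 * story.engagement + 0.2 * recency

def build_feed(candidates: list[Story], slots: int = 20) -> list[Story]:
    # 1,500 candidates in, a screenful out: everything else is "filtered".
    return sorted(candidates, key=score, reverse=True)[:slots]
```

Tweak the weights and a different slice of those 1,500 stories surfaces. That, in miniature, is the whole “algorithm” debate.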
Secondly, the idea that Facebook should be influencing people’s beliefs or fact-checking stories posted to its platform is creepy and dangerous as a precedent: think of how many governments or private parties might also request its assistance in persuading the public.
The Reverse Dog Whistle
It is not the job of a social network to inform and educate. We need journalists and lobbyists to inform and educate the public in a more nuanced, yet digestible, way. And the truth is, both have failed to do so effectively. You may think that endless pieces calling Trump crazy, sexist and racist, or the fact-checking of Brexit leaders’ claims, were achieving that, but as a strategy this does not persuade. It is a “reverse dog whistle”: it only calls the attention of people who already agree with you, and it makes the other side more repulsed and defensive, since they don’t see what the author sees.
Although I find research based on brain scans highly debatable, it’s worth bringing up, since we’ve heard so much bile against “experts” recently. Liberals and rationalists tend to see the growing problem of right-wing populism as something to be solved with education and facts (essentially, Merkel’s appeal to Google boils down to the hope that different information would be convincing). But what data we do have on political leanings and persuadability suggests that storytelling which appeals to readers’ emotions would be more effective on people leaning towards populism. Fact-checking articles don’t just fail to persuade; they actively make people more convinced of their leanings.
Yes. I said that. Data lover said “use storytelling, not data, because data”. I hear myself.
But if the goal is to better inform people across Europe ahead of the coming elections in France, Germany and the Netherlands, the present method of presenting both sides neutrally does not work. Presenting one side factually and calling out the other’s lies does not work.
Human libraries, or Facebook Live casts of real people who defy negative stereotypes telling their stories, are probably more effective than posting fact-checks that only liberals will read. Get a moderator in, so people can express why they feel what they do and have their concerns met in a non-accusatory way. This method has shown efficacy in bringing people around.
So apart from killing Facebook, and the futility of it all, what would be bad about unbubbling people?
Remember how I said that people hold onto their beliefs even more strongly when confronted with facts (not opinions, actual facts)? Best case scenario, assuming people don’t become even more right-wing or log off: a tonne of online ranting inspired by other people’s posts, and then a tonne of counter-ranting on Facebook.
Please. No.
The Bubble and Google
As I’ve said above, if people don’t believe a live researcher telling them, in person, that the article they were just handed is fake, and if they don’t believe the British Medical Journal saying vaccines have no link to autism, they’re not going to believe a search engine result.
Yes, Google do personalise search results. This is why, when I say “Lumos” into my phone, my foodie search history offers me results for “hummus recipes”. But much like Facebook, what else can they do other than lower the quality of their product by not offering personalised results?
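To illustrate the principle, and only the principle, here’s a toy sketch in Python of history-based personalisation. Every query, result and score below is invented; Google’s actual system is proprietary and vastly more complex.

```python
# Toy history-based personalisation: ambiguous queries get nudged towards
# topics the user has searched for before. All names and data are invented.

def personalise(results: list[str], history: list[str]) -> list[str]:
    # Count how often each word appears in the user's past searches.
    interest: dict[str, int] = {}
    for query in history:
        for word in query.lower().split():
            interest[word] = interest.get(word, 0) + 1

    def boost(result: str) -> int:
        # A result scores higher the more it overlaps with past interests.
        return sum(interest.get(word, 0) for word in result.lower().split())

    return sorted(results, key=boost, reverse=True)

# A foodie's history pulls "hummus recipes" above the Harry Potter spell.
history = ["hummus recipes", "tahini brands", "easy falafel"]
results = ["lumos harry potter spell", "hummus recipes", "lumos lighting"]
print(personalise(results, history))
```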
And should search engines be trying to influence people? Or present “all sides” to every search? If I’m googling cancer treatment options, should I see juice fasts because that’s the alternative treatment? If I google “gay rights in Russia”, should I be presented with articles explaining why homosexuality is wrong?
I see neither a moral nor a business benefit to trolling and patronising people in this way.
Why Revealing Their Algorithm Would Be Awful
I do SEO and have done for a long time, albeit not as a specialisation. In Google’s earlier years, ranking factors were pretty clear and simple. I’ve seen every hack used to get frankly awful sites to the top of the search results, because it was so simple. It was super easy for low-ethics sites to get to the first page, using keywords to lure a demographic towards a competing product, for example.
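As a caricature of how gameable those early factors were, here’s a toy keyword-density ranker in Python. It’s simplified far beyond anything a real engine used, but the failure mode is the same: stuffing the page with the keyword wins.

```python
# A toy keyword-density "ranking factor", easily gamed by keyword stuffing.
# Early search engines were more sophisticated than this, but not by enough.

def keyword_score(page_text: str, query: str) -> float:
    words = page_text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == query.lower())
    return hits / len(words)

honest_page = "We review running shoes with detailed lab tests and photos"
spam_page = "shoes shoes shoes buy shoes cheap shoes best shoes shoes"

# The keyword-stuffed page "wins" despite being useless to readers.
print(keyword_score(honest_page, "shoes"))  # ~0.10
print(keyword_score(spam_page, "shoes"))    # ~0.78
```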
Now imagine, if the algorithm were exposed, how anyone could boost the visibility of extremist pages even further, or create fake sites to phish the personal details of far-right supporters, minorities or other vulnerable groups for targeting. Don’t kid yourself that no one would do this. Pundits on both sides are pretty motivated when it comes to trolling, harassing and doxing the other side.
Worse than making it public: the idea that only government-approved content would have access to the secret sauce is horrifying when projected globally.
Remember that the UK spent 9 million pounds on flyers advertising the benefits of ‘Remain’ to every UK home, and the public were still not convinced. What bubble was to blame there?
To Summarise
1) People do not simply change their minds when presented with information that contradicts their beliefs. This is irrespective of political leaning.
2) Journalists and lobbyists have used ineffective methods to persuade people thus far. Use storytelling more, appeal to emotions more and to facts less when trying to reach populist voters. I KNOW IT HURTS to read that.
3) Since each publication has a target audience, journalists do not, largely, aim to persuade the counter-side, only to hold on to their own readership. This is an inherent problem in a free, self-supporting press, and I don’t see a way around it.
4) You can log out of search to see non-personalised results, or use incognito mode in Chrome, btw.
5) Blaming two companies when you fail to convince the public is like blaming the dog for eating your homework. It doesn’t help anyone’s credibility.
Moodthy Alghorairi is a product designer and digital consultant behind Wyld.Media. She’s been designing digital experiences since 2002. She’s a runner, mama to Floki (8 y.o parrot) and Thais (3 y.o human), and head geek at MadridGeeks.es.