Social media is growing, but it appears our ability to critically evaluate it is not keeping pace, writes Anushka Britto.
I WAS READING a book a while ago that contained a powerful image, one that has stayed with me longer than the name of the book itself, which I can no longer recollect.
It discussed the idea that the great predators of the wild – lions, tigers, the big cats – have evolved over millions of years and held their place at the top of the hierarchy through their advantages over other animals. Yet man (or woman), once at their mercy, has suddenly and swiftly overtaken these creatures as the dominant force in the animal kingdom through the invention of guns and weapons.
The book suggested that this was unfair: these majestic animals, who have held court over their habitat for millions of years, have been overpowered by a species that evolved so quickly. With their dominance so inbuilt in their biological coding, they never really had a chance to defend themselves against this new superpower.
While science has let us evolve at a rate faster than would naturally occur, is it the duty of science to provide support and regulation in integrating this change into our ecosystem? Human beings have long surpassed the advantages that fire, gunpowder and farming the land brought us. We have invented the wheel, the steam engine, the aeroplane and the computer. And while change is the only constant, the pace of change in the last decade far surpasses anything we have seen before.
Is it the responsibility of change-makers or governments to ensure that these changes are integrated into our society with proper governance and oversight? Is it the responsibility of the creators of change to support the creatures who have been unfairly disadvantaged by it?
The human race no longer displays dominance through physical strength and aggression. This is the age of information. Information can enable us to make informed decisions which allow us to build wealth, health and happiness.
Each generation has raised the succeeding generation by passing down knowledge on how to make informed decisions based on human interactions, identifying risks and hazards, scanning the market, and weighing the credibility of information. Previously, the rate of change was fairly constant, allowing each new generation to progress while still giving older generations time to learn and adapt to the new technology introduced into our human ecosystem.
However, the introduction of social media platforms has reshaped the way that information is shared, questioned and challenged — making incorrect information much more accessible than it traditionally would have been.
Social media companies have recognised this and have started taking steps to address the critique. Facebook has partnered with a number of news sources to introduce a “news” tab on its U.S. user interface, with a trial planned for 200,000 U.S. users. Similarly, Twitter has introduced a feature that places orange and red badges underneath harmfully misleading posts shared by politicians and public figures.
Perhaps this was a direct result of misinformation in the U.S. Presidential Election in 2016, in which a Google search for the final election numbers and the winner of the popular vote directed users to fake news blogs that provided false information. Both Facebook and Google have since committed to keeping advertising dollars away from fake news sites.
I grew up in the age of social media. I was 15 years old when I got Facebook, but much younger when I had a Bebo account and frequently changed my background theme to reflect my self-professed vibrant personality. But Facebook was the start of my interactions with other adults on social media — mainly my family, but also co-workers and older friends — when, suddenly, at 19 or 20, everyone seemed to have it.
I was already an adult when Facebook became huge and parents started to worry about what we were posting and who we were talking to. The concepts of cyberbullying and identity theft had only just appeared on their radar. The theft of our private information, our ability to make decisions based on information we sought out for ourselves and control over our screen time were not yet concerns; those came much later.
According to the documentary The Social Dilemma on Netflix (which prompted many people to delete their social media accounts in response to some of its claims), one could conclude that social media may have caused more harm than good.
Jonathan Haidt, an American social psychologist, claims that between 2011 and 2013 there was a dramatic increase in cutting and suicide rates among teenage and preteen girls — the period in which social media became huge. The documentary claims that social media has led to a “tripling” of self-harm among pre-teens in the U.S. and a “150 per cent rise in suicides”.
For children entering their teens, navigating the emotional rollercoaster of figuring out who they are and how they relate to their peers can be a stressful time. Add to the mix a direct and constant reflection of your peers’ critique of you and you have a recipe for disaster.
Social media platforms were created by people who love technology and wanted to build a cool product that spread positivity. Justin Rosenstein, who helped create the Facebook "like" button, said in October 2017 that it is very common “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.” Aside from the potential for addiction and its impact on young people, social media has its own filters for "fake information" and suggests similar stories to us based on our interests. But is it the duty of social media to teach people critical thinking?
Perhaps the most recent example of a lack of critical thinking can be seen in relation to the U.S. Election. Regardless of who you thought should win, the disparaging remarks I have heard and the division I have seen between those who hold opposing political views are, in my opinion, symptoms of the "us and them" dichotomy that social media tends to exacerbate. The inference to be drawn is that there are more differences between us than commonalities.
When I tried to engage people in conversation and explain that studies actually show we tend to overestimate our differences from other people, I was met with disbelief, outrage and ignorance.
Last year, sociologist Arlie Russell Hochschild from the University of California discussed the results of a national survey about political beliefs in America. She explained how Democrats and Republicans are bad at predicting each other's world views due to "bubblism" — surrounding ourselves with people who think the way we do, with little opportunity to understand other people's perspectives.
Hochschild said, of one of the study's findings:
'The wilder a person’s guess as to what the other party is thinking, the more likely they are to also personally disparage members of the opposite party as mean, selfish or bad. Not only do the two parties diverge on a great many issues, they also disagree on what they disagree on.'
In fact, the more educated and politically engaged a person was, the less likely they were to understand the point of view of a person from the opposing party.
In her book Strangers in Their Own Land: Anger and Mourning on the American Right, Hochschild documents how a majority of Democrats and Republicans held the same views on carbon emissions, the Paris climate accord, online violence and reforming the "draconian" criminal justice system.
Perhaps if we stepped out of our bubbles and actually talked to one another, we might learn something. Then again, the Pew Research Center's Election News Pathways project showed that 45 per cent of Americans have stopped discussing politics with someone.
While we once had to seek out reputable news sources and discuss our views with varied groups of people, is it any wonder that an algorithm designed to show us more of what we already like and want to see is consistently reflecting our own views back to us — louder than ever in our own echo chamber?
Like the human race, which displaced the animal kingdom from its natural order with the invention of fire and gunpowder, technology in its current state is displacing our highly evolved ability to analyse and question the information presented to us in order to make informed decisions.
People raised in a generation where they would go to the library and find credible sources like encyclopedias and dictionaries to look up information are now unflinchingly absorbing WhatsApp forwards as fact. Just as we have recognised our impact on the animal kingdom and put in place measures to protect endangered species from extinction, is it time that the creators of disruptive technology also took accountability for the impact it has had on human decision-making?
Then again, where would we draw the line on how far we expect a social media company to go in monitoring foreign interference, such as Russia's alleged involvement in the 2016 U.S. Election? This raises the contentious debate over what social media companies should be treated as: a socialising platform, a news source or a public utility.
Humans have grown in intellect and wit, learning new skills to understand the environment around them and respond accordingly to dangers, threats and new information. Yet social media has far outpaced the normal cycle of change and is growing at an exponential rate, with new features and new ways to change and modify information. In turn, our ability to critically evaluate this information and make informed decisions with it appears to be falling behind.
Some may argue that it is the responsibility of the end user to adapt to this new ecosystem, pointing to studies which have found that:
'... [people] who get their news mainly from social media are more likely to believe fake news about coronavirus... those who get their news from more traditional media sources are more likely to follow public health recommendations.'
If this truly were a "survival of the fittest" situation, it would be up to the people learning from these studies to initiate change through deliberate actions — deleting Facebook accounts, migrating to less invasive platforms and reducing screen time — in order to adapt and evolve in this new environment of misinformation.
Perhaps this would make some groups of people predators at the top of the food chain, where the dominant lion of the new animal kingdom is really a critical-thinking, tech-savvy consumer of information with the ability to distinguish real information from fake news.
But we don’t live in a cut-throat world of eat or be eaten — our Darwinistic days are behind us. Indeed, the regulatory approach in African countries has been criticised, and some suggest that educating citizens, not regulating social media, could better curb misinformation.
Should those who opened Pandora’s box be responsible for its outcomes? Is it the responsibility of the creators of social media to also teach its users how to critically evaluate the content that it allows to be created and circulated on its platforms?
Anushka Britto is a day-time auditor, night-time philosopher, writer and creative spirit who lives in Melbourne.
Support independent journalism. Subscribe to IA.