OPINION: Reality distortion fields: social tools gone wrong

David Hormell



Advertisers wanted to increase their online influence, but generalized advertisements proved too broad to be effective. The advent of Google AdSense in 2003 propelled targeted, contextualized advertising into the mainstream.

Based on location and searched keywords, Google crafts a tailored ad experience for the user. According to internet trend analysis agency BuiltWith, over 12 million websites now use Google AdSense.

Contextual advertisements see greater success because they’re based on the user’s likes and dislikes. I’ve received a few contextual ads today, imploring me to revisit New Mexico, to see John Mulaney in Louisville and to listen to Passion Pit’s new album.

The uptick in the popularity of data mining has inspired social media sites to adopt a similar approach. On paper, it makes sense: the internet is always sprawling and expanding, and people want to manage the informational flood and make sense of the static. On Facebook, users can “hide” posts and “unfollow” friends. Twitter lets users “mute” words and accounts. A happy user experience translates to increased engagement and, as a result, more revenue for Mark Zuckerberg and Jack Dorsey.

Unfortunately, the rise of the tailored user experience means the rise of echo chambers, where users log on to Facebook or Twitter to validate their opinions and feelings. It’s the equivalent of digital junk food: it accomplishes little beyond providing some fleeting felicity. By actively constructing personalized mediascapes, we run the risk of accidentally crafting a reality distortion field where facts bend at will and truth is conflated with opinion.

In The Filter Bubble, author Eli Pariser writes: “Personalized filters sever the synapses in the brain. Without knowing it, we may be giving ourselves a kind of global lobotomy.” Instead of the internet existing as a global intersection of thought and digital discourse, isolation from differing viewpoints renders it fragmentary.

Over the last 15 years, internet users have been placed in a position of power as active consumers.

In theory, it’s a remarkable concept. In reality, it’s a digital dystopia predicated on predictability.

After an unusually contentious election, Public Radio International (PRI) published an article by Caitlin Abber titled “Sorry Mom, the kids won’t be coming home for Thanksgiving this year.” The article centered on the trend of millennials skipping Thanksgiving in hopes of avoiding “inevitable arguments.” It’s just one example of electronic thoughts and personalized mediascapes materializing in daily life in painful ways.

The anxious nature of the internet is unbalanced and unhinged, teetering between chaos and randomness; neither outcome is comfortable. Contextualized ads and tailored user experiences can, to an extent, help make sense of the internet’s compounding complexities.

Some social tools are helpful when used responsibly. Muting certain words, hashtags or users on Twitter for a set period can reduce notification fatigue or head off ugly cases of harassment. But misusing these tools – unfollowing friends of differing political persuasions – contradicts the goal of civil discourse and the internet’s original promise.

Each person receives a preselected digital narrative, and much is lost in translation. We miss out on differing viewpoints and consequently contribute to further political polarization.

And we’re fine with it.