
Excessive Time Spent With Electronics Warns Of A Coming Sofalarity

The singularity, some say, is near. But the sofalarity may be even closer.

The singularity is a hypothetical moment in time when artificial intelligence and other technologies will match and exceed human intelligence, merging biological and technological functions to alter humanity as we know it.

Some futurists put the singularity as soon as 2045. Others call it ambiguous, bullshit, and “a bunch of hot air.” Regardless of what the future holds, though, the sentiment is clear: with technology getting smarter, change abounds.

But as we’ve touched upon before, a rise in tech does not necessarily a smarter or healthier person make. Which is why some futurists speculate we’re even closer to an alternative moment, in which humanity spends less and less time with reality (and more time on the sofa) due to the enticing wiles of the web.

The sofalarity: what is it?

The full potential of AI is still unknown, but the rise of mobile devices is already upon us. As smartphones, tablets, and other handy electronics reach ubiquity, humans are spending more time engaged with technology than ever before (more on this later).

This may lead to what some call “the sofalarity,” defined by Rick Searle as a hypothetical stage “when the majority of human beings spend most of their time engaged in activities that have little or no connection to actual life in the physical world.”

To our knowledge, the term was first proposed by Columbia professor Tim Wu (also known for coining “net neutrality”) in a column for The New Yorker. Searle elaborates on the concept in an illuminating 2015 article for the Institute for Ethics and Emerging Technologies (IEET).

The sofalarity sounds a lot less glittery and exciting than the singularity, to be sure. But it might also be more realistic.

How close are we?

Let’s take a moment to get an idea of where exactly we are now, in terms of time spent with little connection to the physical world:

[Chart: Daily electronic media use in the U.S.]

According to a 2014 Nielsen report, Americans spend roughly 11 hours a day engaged with electronic media, most of it on television, followed by radio, mobile, and PC use. Results, of course, will vary by location and demographic.

Now, not all of this is totally removed from physical reality: mobile and Internet time might be spent texting, on social media, or working, which is more of a filtered reality than an alternative one. Television and radio may be used for news consumption, which engages with reality in its own way, too.

Even so, 11 hours is a majority of a person’s waking hours. The chart also does not take into account other activities that could be considered “disengaged from reality,” like reading books, playing physical games, daydreaming, or altering one’s mental state with drugs and alcohol.

This is not to say any activity — electronic or otherwise — that takes us to another place is bad. In fact, games and fantasies can boost brain power, and adequate leisure time is essential to productivity. It’s only when the pendulum swings too far in one direction that we could have a problem.

Already, Searle aptly points out, Japan is home to at least 700,000 individuals living as hikikomori: reclusive young adults who retreat from social life entirely to live out their lives online.

Considering the health concerns already reported in connection with excessive technology use, such as visual, auditory, and sleep disruption, only time will tell whether this kind of lifestyle is a safe one.

What it would mean

The sofalarity is perhaps most extremely, if humorously, depicted in the Disney-Pixar film WALL-E. In WALL-E’s world, humans have been consigned to cushy lives of consumption aboard an elaborate space station above the trash-ridden wasteland that Earth has become. Relieved of labor, exercise, stress, and challenge in general, humanity is content to slip into easy lives of obesity and digital comfort.

It’s funny, but in certain ways chillingly plausible. Given rising obesity rates, rampant consumerism, and the often lecherous entertainment industry, it would take little more than robots overtaking the workforce (to the point at which humans no longer needed to work) for such a cartoonishly pathetic world to come to fruition — for the wealthy, anyway.

According to Wu, this type of future would be the result of an evolution that eliminates earthly discomforts. Were our practical needs to take care of themselves, is there much to stop humanity from dwelling in escapist technology and fantasy on the regular?

Given the time and money spent on porn, television, movies, and playing games (not to mention advanced virtual reality right on the horizon), it’s hardly a stretch to imagine that we could be just as easily dumbed down by technology as we could be enhanced by it.

Fighting or embracing the sofa

Just like the singularity, the sofalarity is a seemingly radical theory that contains, at its core, real concerns about the current trajectory and implications of our digital evolution.

But the sofalarity is far from inevitable. In fact, there are factors that may render it improbable in the extreme. Studies show that the happiest humans have good personal relationships, actively seek goals, are not driven by materialism, and avoid “if only” fantasies. They distance themselves from technology and spend time with people, animals, and nature.

Though some, like Japan’s hikikomori, increasingly feel satisfied without these, it seems unlikely that the majority would divorce themselves from the biological satisfactions of physical life.

As for technology and its ill effects, futurist Jon Perry countered Wu’s article in 2014 by writing that current pitfalls are a phase, and that “as technology advances, we will most likely engineer healthier, better-tasting foods and find better ways to encourage exercise.”

As long as this is the case, technology that supports physical goals, hopes, and responsibilities will accompany, if not outpace, technology that helps us escape them.


Jennifer Markert