Limitless Knowledge and Limited Time


Two British-style phone booths on campus, gifts of the George Washington University classes of 1998 and 1999, are gone. They’ll be replaced by benches. Nobody needs a phone booth today, when every student has a phone — and, via Google, all the information ever collected in the history of humanity — conveniently in a pocket. And who has time to sit?

In the age of multitasking, the idea of doing more with less (or less with more) is a common one. We read more than we used to, yet critics charge that what we read, and how we read it, is making us stupid.

The gist is that instead of reading books and long-form news articles, we read shorter, quicker pieces; in essence, we skim. You can’t read Ulysses in 140 characters.

The staccato rhythm of tweets, blogs, and blurbs reduces our reading to skimming. News articles are boiled down to updates we get on our phones, and we seek out analysis slanted to reflect our preexisting biases.

The key reproach is that in this multitasking world, we don’t analyze enough. We don’t use critical thinking.

In the great tradition of Western thought and education, critical thinking has been the foundation of intellectual civilization. This critique, though, rests on two assumptions: first, that the old form of reading gave us access to the same amount of knowledge we have now; second, that we are missing something critical by abandoning it. Today we have instant access to almost all the information in human history, but the same amount of time to read it as before.

Time is the crux of the argument.

When the modern clock came into existence in the 17th century, humanity’s perception of time changed. It became mechanical, precise, and fixed. No matter where you were or what time of year it was, there were still 24 hours in the day. This mechanization of time would go on to define the scientific revolution and the Enlightenment, as humanity began to see itself as nothing more than a machine with a biological clock ticking away.

In the digital age that notion hasn’t changed; it has grown stronger and more precise with the invention of the atomic clock and the coordination of GPS technology. And as before, humanity is following suit, now thinking of ourselves as advanced computers rather than clocks. We speak of our brains being “hardwired” to do such and such, or use analogies of hardware and software to describe the difference between the brain and the mind.

The question remains, though: Are we missing out on something critical by absorbing more information in the same amount of time? Are we no longer the deep thinkers we once were?

No. Instead, humanity is adapting to a new world that requires the quick assessment of new information, a skill indispensable in everything from intelligence gathering to STEM research and entrepreneurship. Analysis isn’t disappearing; it’s adapting, learning to do more in less time.

One lesson humanity has learned from history is that those who try to predict it are usually wrong. We don’t know what human civilization will look like in 100 years. In the 1960s it seemed inevitable that humans would be colonizing the moon by now (something I am still waiting for).

What we do know is that we are changing. Change can be good or bad, but it is inevitable, and trying to stop it is like trying to stop time. And one thing we as a species have learned since we set our clocks ticking is that time keeps marching forward, no matter what we think about it.

Marc Furtado is a graduate student at George Washington University.
