Elliott Ditman
Knowledge and Information Management
Essay 1

Are Humans and Computers bound for Singularity?

Over the past millions of years, life on Earth has been radically altered. From single-celled organisms to what we know today, it has generally proceeded along a path toward higher intelligence. If one were to look into the future, what would one see? One of the most resonant predictions of what life may look like one day is a world where humans and computer technology live in perfect harmony, known as “the Singularity”.

What is the Singularity?

The author and futurist Ray Kurzweil calls the Singularity “a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed” (The Singularity Is Near, 7). In the past few decades, as technology has continued to change and evolve, the realization of the Singularity has seemed just over the horizon. Twenty years ago, the average person would have been lucky to use a computer more than a couple of times a day. Ten years ago, the computer had infiltrated almost all parts of modern business and recreation. Now, it is hard to even imagine a day without computers or computer technology. The advent of the Singularity promises to change society far beyond any of this.

Until now, all modern technology has remained fundamentally separate from the human body. The practical results of the Singularity will be devices that transcend this boundary: microscopic nanomachines infused directly into the bloodstream, for example. Crossing such a boundary could open countless opportunities, from impossibly long life to knowledge management in its purest form. Further exploration would invariably lead to the changes that would bring about the part of Kurzweil’s definition where “human life will be irreversibly transformed”.

The Brain and the Computer

For as long as science has been a relevant topic in society, people have attempted to explain how the brain functions and where memories and thoughts come from. The ancient Egyptians believed that the brain served no important function and that the heart was responsible for decision-making (http://www.pbs.org/wnet/brain/history/2500bc.html?position=179?button=2). It took another 2000 years before a Greek physician would proclaim that the brain, rather than the heart, was the organ where thoughts and sensations originated (http://www.pbs.org/wnet/brain/history/450bc.html?position=208?button=4). Over the next 2000 years, from antiquity to the beginning of the 1800s, the nature of the brain was fiercely debated, with very little explanation of its true functioning provided.

The twentieth century was different, however: countless discoveries in psychology and neuroscience that remain relevant today were made in this century. Another important development of the century was the birth of computer science. Unlike neuroscience, however, where a top-down approach has proven more successful, computer science requires a bottom-up approach. In other words, computer science requires creation, while neuroscience requires exploration of what is already there. In recent years, the two sciences have become more and more intertwined.

According to Peter J. Denning, the fundamental question underlying computer science is, "What can be (efficiently) automated?" (http://web.archive.org/web/20060525195404/http://www.idi.ntnu.no/emner/dif8916/denning.pdf). As the field has matured, the complexity of tasks that can be modeled and reproduced by a computer has increased. If the trend continues, it is realistic to assume that even the most complex human thought processes and brain patterns will one day be reproduced.

“A machine is as distinctively and brilliantly and expressively human as a violin sonata or a theorem.” (Gregory Vlastos, http://www.aleph.se/Trans/Tech/index-2.html)

The ultimate knowledge management

Wikipedia says that knowledge management “comprises a range of strategies and practices used in an organization to identify, create, represent, distribute, and enable adoption of insights and experiences”. If this is an accurate description, then the advancement brought about by the Singularity can be considered the ultimate form of knowledge management. One of the most difficult parts of managing memories and experiences is recording them in a way that stays relevant. The medium is the hindrance, not the experience.

Maryam Alavi proposes that different frameworks exist for managing different kinds of knowledge: one for tacit knowledge, which is internalized knowledge that an individual may not be consciously aware of, and another for explicit knowledge, which can easily be communicated to others (http://mmlab.ceid.upatras.gr/courses/AIS_SITE/files/projects2004/paper711/14_KM_KMsystems_Alavi_MISQ.pdf). The latter is usually considered easy to manage, but the former is almost always elusive. People have things that they do easily but struggle to explain. Yet regardless of whether someone can explain their actions, the instructions exist somewhere in the brain. The fall of the boundary between computers and human beings proposed by the Singularity would make this knowledge instantly available.

More than just recording information, knowledge management is about making it useful and relevant for others. The community for knowledge has already been greatly expanded by the advent of the internet. A technology that scans deeper and more thoroughly into the brain would expand the movement even further. Knowledge recorded through Singularity technology could be made widespread and ubiquitous at speeds faster than knowledge is made available through the internet today.

Why not today?

Despite the best promises of the Singularity movement, it can easily be argued that it is based on pure hypothesis and holds little water. Humans today understand the brain in theory, but seemingly have little ability to actually affect or study it. The field of computer science has developed to a large extent, but remains unable to reproduce anything but mundane human cognitive abilities. And hardware inadequacy is not the culprit.

In 2009, one of the most powerful computer systems in the world was the supercomputer “Jaguar” at Oak Ridge National Laboratory (http://ornl.gov/), with a peak performance of 1,750 teraflops. Hans Moravec, of the Robotics Institute at Carnegie Mellon University, estimated the human brain's processing power at around 100 teraflops, roughly 100 trillion calculations per second (http://www.wired.com/techbiz/it/news/2002/11/56459). Jaguar already exceeds that estimate many times over, yet it does not think; this suggests it isn't just about hardware speed, but rather a careful mix of powerful hardware and intelligent software that may one day produce Singularity-type machines.
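The comparison of these two figures can be sketched as a back-of-envelope calculation (purely illustrative; the 100-teraflop figure is Moravec's estimate, and peak teraflops are at best a crude proxy for thought):

```python
# Jaguar's 2009 peak performance vs. Moravec's estimate of the
# brain's processing power, both in teraflops.
jaguar_teraflops = 1750
brain_teraflops = 100

ratio = jaguar_teraflops / brain_teraflops
print(f"Jaguar's peak is {ratio:.1f}x the estimated brain throughput")
# Raw speed alone already exceeds the brain estimate, yet no such
# machine thinks -- software, not hardware, is the missing piece.
```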

The ultimate test of computer intelligence is the “Turing test”, devised by Alan Turing (http://en.wikipedia.org/wiki/Turing_test). In the test, a single human judge remotely asks questions of a computer contestant and a human contestant. If the judge is unable to distinguish between the two, the computer passes the test. No computer has yet been able to pass this test, which is almost a prerequisite for Singularity-type machines.
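The protocol above can be sketched in a few lines of code. Everything here is a placeholder (the contestants, the questions, and the judge are all invented for illustration); a real test would pit a person against a conversational program over many rounds:

```python
import random

def human_contestant(question):
    return "I'd have to think about that one."

def machine_contestant(question):
    # A machine that passes must produce answers as human as the human's.
    return "I'd have to think about that one."

def run_turing_test(questions, judge):
    # Hide which contestant is which behind anonymous labels.
    contestants = {"A": human_contestant, "B": machine_contestant}
    machine_label = "B"
    transcript = {label: [f(q) for q in questions]
                  for label, f in contestants.items()}
    guess = judge(transcript)        # the judge names the machine
    return guess != machine_label    # the machine passes if the judge is wrong

questions = ["What is your favorite memory?", "Does a joke ever make you cry?"]
# With indistinguishable transcripts, a judge can only guess at random,
# so the machine passes about half the time -- the pass criterion Turing set.
guessing_judge = lambda transcript: random.choice(["A", "B"])
print(run_turing_test(questions, guessing_judge))
```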

Ray Kurzweil and other futurists argue that the Singularity will bring computers with all the problem-solving ability, proficiency, and emotional intelligence of the human brain. This would be so far in advance of what is available today that it is almost impossible to imagine. Predicting the future is always difficult, but in the case of the Singularity it is even more so: nothing that has happened in the past can be compared to it. The best hope of predicting the plausibility of such an era is to look backwards at the rate at which new inventions have appeared and become widely available.

Statistical evidence suggests that both of these figures have been increasing exponentially, with less time between each successive invention and revolution. A good example is that it took over 30 years from the initial release of the television until it was in more than 25 percent of American households, while it took less than a third of that time for the internet to become as prolific (The Singularity Is Near, 50). The speed of widespread communication, among dozens of other factors, is drastically lessening the amount of time it takes for revolutionary products to become ubiquitous.
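The adoption figures cited above reduce to simple arithmetic (a sketch using only the numbers in the text; "less than a third" is treated as an upper bound):

```python
# TV: roughly 30 years to reach 25 percent of American households.
tv_years = 30
# The internet took less than a third of that time to become as prolific.
internet_years_upper_bound = tv_years / 3

print(internet_years_upper_bound)
# So the internet reached the same penetration in under a decade --
# the shrinking gap that the exponential-adoption argument rests on.
```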

As products bearing Singularity-like features are released at faster rates, the likelihood of such an event becomes more realistic. Already, in the past couple of decades, the thought of computers being everywhere has been realized. Fifty years ago, computer chips were huge and mostly restricted to scientists and government agencies. Today, almost every US citizen has access to multiple computer devices on a daily basis. It isn't a stretch to imagine that in the next few decades, computer technology may become even more prolific, even directly within the human body and mind.

Works Cited

Kurzweil, Ray. The Singularity Is Near: When Humans Transcend Biology. New York: Viking, 2005. Print.

Oak Ridge National Laboratory. Web. 09 Mar. 2010. <http://ornl.gov/>.

"The Secret Life of the Brain : History of the Brain." PBS. Web. 09 Mar. 2010. <http://www.pbs.org/wnet/brain/history/2500bc.html?position=179?button=2>.

"The Secret Life of the Brain : History of the Brain." PBS. Web. 09 Mar. 2010. <http://www.pbs.org/wnet/brain/history/450bc.html?position=208?button=4>.

"The Technological Sphere." Anders Sandberg's Web. Web. 09 Mar. 2010. <http://www.aleph.se/Trans/Tech/index-2.html>.

"This Is Your Computer on Brains." Wired News. Web. 09 Mar. 2010. <http://www.wired.com/techbiz/it/news/2002/11/56459>.

Denning, Peter J. Web. <http://web.archive.org/web/20060525195404/http://www.idi.ntnu.no/emner/dif8916/denning.pdf>.
