A machine will not hug you… but it may listen and offer advice

Norbert Biedrzycki
7 min read · Aug 13, 2017


As users, we have grown to take technology for granted. Hardly anything these days is as commonplace and unremarkable as a personal computer that crunches numbers, lets us read files and gives us access to the Internet. Will computers ever amaze us again? Some potential for amazement may lie in cognitive computing, a set of capabilities widely considered the most vital manifestation of artificial intelligence.

Back during my university days, and later at the outset of my professional career, I wrote software. I earned my first paycheck as a programmer. I often stayed up late, even pulling all-nighters, to correct endless code errors. There were moments when the code I wrote finally did just what I wanted it to, serving its intended purpose, and in time such moments became more and more frequent. I often wondered whether programmers would ever be replaced. But how, and with what? The science fiction literature I read abounded with stories about robots, artificial intelligence and self-learning technologies that overstepped their boundaries and began to act against their rules, procedures and algorithms. Such technologies managed to learn from their mistakes and accumulate experience. It was all science fiction then. A computer program that did anything other than the tasks assigned to it by its programmer? What a fantasy. But then I came across other concepts, such as self-learning machines and neural networks.

As it turns out, a computer program may amass experience and apply it to modify its behavior. In effect, machines learn from experience that is either gained directly by themselves or implanted into their memories. I have learned about algorithms that emulate the human brain. They self-modify in the search for the optimal solutions to given problems. I have learned about cognitive computing, and it is my reflections on this topic that I would like to share in this article.
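To make the idea of self-modification concrete, here is a minimal sketch, assuming nothing beyond plain Python: a single artificial neuron (a perceptron) that adjusts its own weights each time it makes a mistake. The task and all numbers are invented purely for illustration.

```python
# A minimal sketch of "learning from experience": a perceptron that nudges
# its weights after every wrong answer. Illustrative only; the task (the
# logical AND of two inputs) and all parameters are invented.

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # (inputs, expected)

weights = [0.0, 0.0]   # the program's adjustable "experience"
bias = 0.0
learning_rate = 0.1

def predict(x):
    # Fire (return 1) if the weighted sum of the inputs crosses the threshold.
    total = weights[0] * x[0] + weights[1] * x[1] + bias
    return 1 if total > 0 else 0

for epoch in range(20):                  # repeat over the data many times
    for x, target in examples:
        error = target - predict(x)      # how wrong is the current behavior?
        # Self-modification step: shift the weights toward fewer mistakes.
        weights[0] += learning_rate * error * x[0]
        weights[1] += learning_rate * error * x[1]
        bias += learning_rate * error

print(weights, bias)                       # the behavior has changed with experience
print([predict(x) for x, _ in examples])   # now matches the expected answers: [0, 0, 0, 1]
```

Nobody edits the program's code during those twenty passes; the change in behavior comes entirely from the accumulated corrections, which is the point that struck me back then.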

As it processes numbers, a computer watches my face

All the existing definitions of cognitive computing share a few common features. Generally speaking, the term refers to a collection of technologies that stem largely from studies of how the human brain works. It describes a marriage of sorts between artificial intelligence and signal processing, both of which are key to the development of machine consciousness. Together they enable advanced capabilities: machines that learn on their own, reason and draw their own conclusions, process natural language, produce speech, interact with humans and much more. All of these are aspects of collaboration between man and machine. Briefly put, cognitive computing refers to technology that mimics the way the human brain processes information and enhances human decision-making.

Cognitive computing. What can it be used for?

Cognitive computing emulates human thinking. It augments the devices that use it while empowering the users themselves. Cognitive machines can understand natural language and respond to information extracted from conversational interactions; they can also recognize objects, including human faces. Their sophistication is unmatched by any product ever made in the history of mankind.

Time for a snack, Norbert

In essence, cognitive computing is a set of features and properties that make machines ever more intelligent and, by the same token, more people-friendly. Cognitive computing can be viewed as a technological game changer and a new, subtle way to connect people and the machines they operate. While it is neither emotional nor spiritual, the connection is certainly more than a mere relationship between subject and object.

Owing to this quality, computer assistants such as Apple’s Siri are bound to become gradually more human-like. The effort to develop such features will focus on the biggest challenge facing technology developers: making machines understand humans accurately, that is, comprehending not only the questions people ask but also the intentions behind them and the meaningful hints users give while working on a given problem. In other words, machines should account for the conceptual and social context of human actions. An example? A simple question about the time of day put to a computer assistant may soon be met with a matter-of-fact response followed by a genuine suggestion: “It is 1:30pm. How about a break and a snack? What do you say, Norbert?”
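Purely to illustrate the idea, and not as a description of how Siri actually works, a toy version of such a context-aware reply might look like the sketch below; the thresholds, phrases and the hand-written habit rule are my own invention.

```python
# Illustrative sketch only: answer the literal question, then add a
# suggestion drawn from context (here, just the time of day). A real
# assistant would learn such habits from interaction data rather than
# rely on hand-written rules.

from datetime import datetime

def answer_time_question(now: datetime, user_name: str) -> str:
    hour12 = now.hour % 12 or 12
    suffix = "am" if now.hour < 12 else "pm"
    reply = f"It is {hour12}:{now.minute:02d}{suffix}."   # the literal answer

    # Contextual follow-up: a stand-in for a learned model of the user's habits.
    if 12 <= now.hour <= 14:
        reply += f" How about a break and a snack? What do you say, {user_name}?"
    elif now.hour >= 22:
        reply += f" It is getting late, {user_name}. Perhaps wrap up for today?"
    return reply

print(answer_time_question(datetime(2017, 8, 13, 13, 30), "Norbert"))
# -> It is 1:30pm. How about a break and a snack? What do you say, Norbert?
```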

Picture: How Siri works. Source: HowTechnologyWork

Dear machine — please advise me

I’d like to stop here for a moment and refer the reader to my previous article on machine learning. In it, I explained that machine learning enables computers to learn, and therefore to analyze data more effectively. Machine learning adds to a computer’s overall “experience”, which it accumulates by performing tasks. For instance, IBM’s Watson, the computer I have mentioned on numerous occasions, understands questions asked in natural language. To answer them, it searches through huge databases of various kinds, be they business, mathematical or medical. With every successive question (task), the computer hones its skills. The more data it absorbs and the more tasks it is given, the greater its analytical and cognitive abilities become.
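To picture that pattern, here is a toy sketch, entirely unrelated to Watson’s real architecture: a program that stores every question it has handled and matches new natural-language questions against that growing store of experience by simple word overlap.

```python
# Toy sketch of "experience" accumulating with every answered question.
# Word overlap is a crude stand-in for real natural language understanding.

experience = []   # list of (question words, answer) pairs learned so far

def learn(question: str, answer: str) -> None:
    experience.append((set(question.lower().split()), answer))

def answer(question: str) -> str:
    words = set(question.lower().split())
    if not experience:
        return "I don't know yet."
    # Pick the stored answer whose question shares the most words with the new one.
    overlap, best = max((len(words & q), a) for q, a in experience)
    return best if overlap > 0 else "I don't know yet."

learn("what is the capital of france", "Paris")
print(answer("which city is the capital of france"))   # -> Paris, matched by overlap

learn("who wrote the trial", "Franz Kafka")
print(answer("who wrote the trial"))   # -> Franz Kafka, a new skill learned the same way
```

The larger the store of handled questions grows, the more often a new question finds a good match, which is a crude analogue of Watson honing its skills with every task.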

Machine learning, sophisticated though it is, remains a fairly basic machine skill with parallels to the human brain: it allows a kind of self-improvement based on experience. It is not until cognitive computing enters the picture, however, that users can truly enjoy interacting with a technology that is practically intelligent. The machine not only provides access to structured information but also autonomously writes its own algorithms and suggests solutions to problems. A doctor, for instance, may expect IBM’s Watson not only to sift through billions of pieces of information (Big Data) and draw correct conclusions from it, but also to offer ideas for resolving the problem at hand.

At this point, I would like to give an example from daily experience. An onboard automobile navigation system relies on massive amounts of topographic data, which it analyzes to generate a map. The map is then displayed, complete with a route from the requested point A to point B, taking proper account of the user’s travel preferences and prior route selections. This relies on machine learning. However, it is not until the onboard system suggests a specific route that avoids heavy traffic and incorporates our habits that it begins to approximate cognitive computing.
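A minimal sketch of that last step might look like the following; the road graph, traffic factors and habit discounts are invented for illustration and are not taken from any real navigation system.

```python
# Illustrative sketch: route choice where each road segment's cost blends
# distance, current traffic and a learned preference from past trips.

import heapq

# road graph: segment -> length in km
roads = {
    ("A", "X"): 4.0, ("X", "B"): 4.0,   # short corridor, but congested below
    ("A", "Y"): 5.0, ("Y", "B"): 5.0,   # longer ring road
}
traffic = {("A", "X"): 2.5}          # heavy traffic on A -> X right now
habit_discount = {("A", "Y"): 0.8}   # the driver historically prefers the ring road

def segment_cost(seg):
    jam = traffic.get(seg, 1.0)               # congestion inflates the cost
    preference = habit_discount.get(seg, 1.0) # familiar roads get a discount
    return roads[seg] * jam * preference

def best_route(start, goal):
    # Plain Dijkstra search over the tiny graph, using the blended cost.
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for (u, v) in roads:
            if u == node and v not in seen:
                heapq.heappush(queue, (cost + segment_cost((u, v)), v, path + [v]))
    return None

print(best_route("A", "B"))
# -> (9.0, ['A', 'Y', 'B']): longer on paper, but it dodges the jam and fits the driver's habits
```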

Number crunching is not everything

All this is fine, but where did today’s engineers get the idea that computers should do more than crunch numbers at a rapid pace? Jeffrey Welser, the head of IBM’s Almaden Research Center, who has spent decades working on artificial intelligence, offered this simple answer: “The human mind cannot crunch numbers very well, but it does other things well, like playing games, strategy, understanding riddles and natural language, and recognizing faces. So we looked at how we could get computers to do that.”

Efforts to use algorithms and self-learning to build a machine that helps humans make decisions have produced spectacular results. In designing Watson, IBM significantly raised the bar for the world of technology.

How do we now apply it?

The study of the human brain, which has become a springboard for advancing information technology, will without a doubt have broader implications for our lives, affecting business, safety, security, marketing, science, medicine and industry. “Seeing” computers that understand natural language and recognize objects can help everyone, from school teachers to scientists searching for a cure for cancer. In the world of business, the technology should, in time, help use human resources more efficiently, find better ways to acquire new competencies, and ultimately loosen the rigid corporate rules that come from adhering to traditional management models. In medicine, much has already been written about the hopes doctors place in IBM’s Watson as an analytical tool. In health care, Watson can go through a patient’s medical history in an instant, help diagnose health conditions and give doctors immediate access to information that previously could not be retrieved within the required time. This may become a major breakthrough in diagnosing and treating diseases that cannot yet be cured.

Watson has attracted considerable interest from the oncology community, whose members have high hopes for the computer’s ability to rapidly search through giant cancer databases (which is crucial in cancer treatment) and provide important hints to doctors.

Combined with quantum computing, cognitive systems could become a robust tool for solving complex technological problems. Even today, marketing experts recognize the value of cognitive computing systems, which are playing an increasingly central role in automation, customer relationships and service personalization. Every area of human activity in which data processing, strategic planning and modeling matter will eventually benefit from these technological breakthroughs.

The third age of machines

Some go so far as to claim that cognitive computing will usher in the third age of IT. Early in the 20th century, computers were seen as mere counting machines. Starting in the 1950s, they began to rely on huge databases. In the 21st century, computers have learned to see, hear and think. Since human thinking is a complex process whose results are often unpredictable, we may presume that a cognitive union of man and machine will soon lead to developments that are now difficult to foresee.

Machines of the future are bound to change the way people acquire and broaden their knowledge, bringing a kind of “cognitive” acceleration. However, regardless of what the future may bring, the present day, with its ever more capable thinking computers, is already becoming more and more exciting.

Picture: Exponential growth in technological capabilities will lead to the Singularity. Source: Sneaky Magazine

Related articles:

- Only God can count that fast — the world of quantum computing

- Machine Learning. Computers coming of age

- The invisible web that surrounds us, i.e. the Internet of Things

- According to our computers … You don’t exist

- The lasting marriage of technology and human nature

- The brain — the device that becomes obsolete

- How machines think


Norbert Biedrzycki

Technology is my passion. Head of Microsoft Services CEE. Private opinions only