What Is Cognitive Computing?

December 3

Almost four years ago, IBM’s Watson computer took on the top two all-time human winners in the history of Jeopardy (the American game show)—and won handily. Ken Jennings, who holds the all-time Jeopardy winning streak at 74 games, wrote in a Slate article titled “My Puny Human Brain” that he had misjudged Watson’s cognitive abilities:

The computer’s techniques for unraveling Jeopardy clues sounded just like mine. That machine zeroes in on key words in a clue, then combs its memory (in Watson’s case, a 15-terabyte data bank of human knowledge) for clusters of associations with those words. It rigorously checks the top hits against all the contextual information it can muster: the category name; the kind of answer being sought; the time, place, and gender hinted at in the clue; and so on. And when it feels “sure” enough, it decides to buzz. This is all an instant, intuitive process for a human Jeopardy player, but I felt convinced that under the hood my brain was doing more or less the same thing.
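The loop Jennings describes—zero in on key words, comb a knowledge bank for associated candidates, check them against context, and buzz only when “sure” enough—can be sketched in a few lines of Python. This is a toy illustration, not Watson’s actual pipeline: the `KNOWLEDGE` dictionary, the overlap-based score, and the 0.5 threshold are all invented stand-ins for the example.

```python
# A toy sketch of the retrieve-score-buzz loop Jennings describes.
# None of the data or scoring here comes from Watson itself.

# Stand-in for Watson's 15-terabyte knowledge bank:
# candidate answer -> words associated with it.
KNOWLEDGE = {
    "Mercury": {"planet", "smallest", "closest", "sun", "element", "liquid", "metal"},
    "Jupiter": {"planet", "largest", "gas", "giant", "storm", "red", "spot"},
    "Mars": {"planet", "red", "rover", "fourth", "sun"},
}

STOPWORDS = {"the", "a", "an", "of", "this", "is", "in", "to", "its", "has"}

def keywords(clue: str) -> set:
    """Zero in on the content words of a clue."""
    return {w.strip(".,?!").lower() for w in clue.split()} - STOPWORDS

def answer(clue: str, threshold: float = 0.5):
    """Return (best_candidate, confidence) if confident enough, else None."""
    clue_words = keywords(clue)
    # Score each candidate by the fraction of clue keywords it is associated with.
    scores = {
        cand: len(clue_words & assoc) / max(len(clue_words), 1)
        for cand, assoc in KNOWLEDGE.items()
    }
    best = max(scores, key=scores.get)
    if scores[best] >= threshold:
        return best, scores[best]  # confident enough to buzz
    return None  # stay silent rather than guess

result = answer("This gas giant planet has a giant red storm called the Great Red Spot")
```

The `threshold` check mirrors the “sure enough to buzz” decision: a wrong buzz on Jeopardy costs money, so the system abstains when its top candidate scores poorly.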

Within a week after this series of episodes aired, Nuance licensed Watson’s technology to build health care apps, and in January 2014 IBM launched the Watson Business Group, a new division devoted to developing cloud-based cognitive applications for businesses. In its press release, IBM also said it would invest $1 billion in the Watson Group, $100 million of which would be earmarked for startups building applications to run on the Watson Developer Cloud.

With all this talk about Watson (interesting fact: Watson was named after IBM’s first CEO, Thomas J. Watson, not Dr. Watson of Sherlock Holmes fame, as I had assumed), you may be wondering what “Cognitive Computing” is all about and how it could affect your life in the future. Let’s see if my puny human brain can give you an overview.

Multiple Definitions

The first definition I read for Cognitive Computing was a quote from current IBM CEO Ginni Rometty that I found in a PCWorld article by Joab Jackson covering the announcement of the Watson Business Group:

Today we are in the “programmable era” of computers, in which all the possible actions that a computer can take must be programmed in advance, [Rometty] explained. In contrast, Watson is “a new species” [that] is taught—it is not programmed. “It runs by experience and from interaction. By design, it gets smarter over time and gives better judgments over time,” Rometty said.

While this definition seems clear-cut, it isn’t the only definition out there. ProgrammableWeb reporter Janet Wagner says Deloitte University Press describes cognitive computing as having three main components: machine learning, natural language processing, and advancements in the enabling infrastructure.

Wagner also refers to a May 2014 YouTube video, in which IBM Research senior vice president John E. Kelly III and IBM writer Steve Hamm elaborate on Rometty’s explanation:

Kelly: Cognitive computing [is] not just a new computing system or computing paradigm but a whole new era of computing…[T]he explosion of data in the world and the rate and pace of change…has outstripped our ability to reprogram these systems…We’ve coined this [era] “cognitive” because it has attributes that are more like human cognition. These are not systems that are programmed; they’re systems that learn. These are not systems that require data to be neatly structured in tables or relational databases. They can deal with highly unstructured data, from tweets to signals coming off sensors.

Hamm: Cognitive computing is not artificial intelligence. … Cognitive computing has the modesty of not trying to replicate the human brain.

Hamm’s distinction is an interesting one. While Jeopardy contestant Jennings may see similarities between the way his mind comes up with the correct question and the way Watson does, Watson and its immediate descendants are not following the same cognitive pathways a human brain does, if only because we still don’t fully understand how the brain works.

At the same time, technology that is capable of learning, rather than having to be programmed, is a huge deal. In a recent article for MIT Technology Review, Will Knight writes that cognitive computing platforms like Watson could “mine vast amounts of data” to find answers to various questions or other useful insights.

And the IBM web page “Why Cognitive Systems?” addresses why this capability could be a huge benefit for humans (without having to worry about HAL-like complications):

These systems will learn and interact to provide expert assistance to scientists, engineers, lawyers, and other professionals in a fraction of the time it now takes. Far from replacing our thinking, cognitive systems will extend our cognition and free us to think more creatively. In so doing, they will speed innovations and ultimately help build a Smarter Planet.

In my next post, I’ll look at some real-world applications for cognitive computing systems like Watson. Until then, let’s hear your thoughts about cognitive computing in the comments or on Twitter!

Photo credit: Neil Conway via Flickr