7/08/2013

Cognitive computing

Artificial intelligence meets business intelligence


What is cognitive computing?


Cognitive computing systems learn and interact naturally with people to extend what either humans or machines could do on their own. They help human experts make better decisions by penetrating the complexity of Big Data.
Big Data is growing in volume, speed and uncertainty, and much of it comes in unstructured forms such as video, images and text. A new type of computing system is needed to understand, process and make sense of it.
Cognitive computers are not programmed to perform a function or set of tasks; rather, they use artificial intelligence (AI) and machine learning algorithms to sense, predict and, in some ways, think. This allows these systems to comprehend and draw insight from Big Data. In order to handle this type of processing, cognitive computers require new hardware innovations in which data processing is distributed throughout the system and memory and processing are more tightly integrated. Eventually, entirely new architectures will be modeled after the way the human brain processes information.
Cognitive computing isn't about the computer becoming the primary expert so much as assisting human experts. With deep domain expertise in fields such as healthcare, banking and commerce, and with data visualization techniques, cognitive computing helps humans solve complex problems and make sense of Big Data. Cognitive computing systems get smarter the more they are used.
Read chapter one of "Smart Machines: IBM's Watson and the Era of Cognitive Computing" by Director of IBM Research, John E. Kelly III and IBM writer Steve Hamm

Director of IBM Research, John E. Kelly III discusses Watson and cognitive computing with Computer History Museum CEO John Hollar (1:18:23)

Learn more about this new era of cognitive computing (4:46)

IBM Watson: The Science Behind an Answer (6:43)

IBM Watson: The Face of Watson (5:26)

Announcing the IBM Watson Engagement Advisor (0:58)

Memorial Sloan-Kettering and IBM Watson to Advance Cancer Care (2:08)

The first cognitive computing system: Watson
The first cognitive computer was Watson, which debuted in a televised Jeopardy! challenge where it bested the show’s two greatest champions. The challenge for Watson was to answer questions posed in every nuance of natural language, such as puns, synonyms and homonyms, slang, and jargon.
Watson was not connected to the Internet for the match. It only knew what it had amassed through years of persistent interaction and learning from a large set of unstructured knowledge. Using machine learning, statistical analysis and natural language processing to find and understand the clues in the questions, Watson then compared possible answers, by ranking its confidence in their accuracy, and responded – all in about three seconds.
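The general pattern behind that answer pipeline, generating candidate answers, scoring the evidence for each, and ranking by confidence before responding, can be sketched in a few lines of Python. This is only an illustration of the idea, not Watson's DeepQA implementation; every function, score and threshold below is a hypothetical placeholder.

```python
# Minimal sketch of a candidate-ranking question-answering loop.
# This is NOT Watson's DeepQA pipeline; every function and value here is a
# hypothetical stand-in for the general pattern described above: generate
# candidate answers, score the evidence for each, rank by confidence, and
# answer only when the top confidence is high enough.

from dataclasses import dataclass

@dataclass
class Candidate:
    answer: str
    confidence: float  # combined evidence score in [0, 1]

def generate_candidates(clue: str) -> list[str]:
    """Hypothetical candidate generation, e.g. from search over a text corpus."""
    return ["Toronto", "Chicago"]  # placeholder results

def score_evidence(clue: str, answer: str) -> float:
    """Hypothetical evidence scorers (keyword overlap, answer-type match, etc.)
    combined into a single confidence value."""
    return 0.5  # placeholder score

def answer_clue(clue: str, threshold: float = 0.7) -> Candidate | None:
    ranked = sorted(
        (Candidate(a, score_evidence(clue, a)) for a in generate_candidates(clue)),
        key=lambda c: c.confidence,
        reverse=True,
    )
    best = ranked[0] if ranked else None
    # Only "buzz in" when the top confidence clears the threshold.
    return best if best and best.confidence >= threshold else None
```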
Newer generations of Watson are currently being trained in oncology diagnosis for healthcare professionals, and in customer service as a support representative.
At Cleveland Clinic, Watson is partnering with medical students using a visualization technology called WatsonPaths
IBM Journal of Research and Development: This is Watson

Domain expertise
Unlike expert systems of the past, which required a human expert to hard-code rules into the system, cognitive computers can process natural language and unstructured data and learn from experience, much the same way we do. And while they'll have deep domain expertise, cognitive computers won't replace human experts; they'll act as decision-support systems that help people make better decisions, whether in healthcare, finance or customer service.
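To make that contrast concrete, here is a toy Python sketch, not any IBM system: a fraud-screening rule hand-written by an expert next to a tiny perceptron that learns a similar decision boundary from labelled examples. The features, labels and thresholds are invented for illustration.

```python
# Toy contrast between a hard-coded expert-system rule and a model that
# learns its decision boundary from examples. Illustrative only; real
# cognitive systems learn from far richer, unstructured data.

# 1) Expert-system style: a human expert encodes the rule by hand.
def flag_transaction_by_rule(amount: float, foreign: bool) -> bool:
    return amount > 10_000 or (foreign and amount > 2_000)

# 2) Learned style: a tiny perceptron infers its weights from labelled history.
def train_perceptron(examples, labels, epochs=50, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(examples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Features: (scaled amount, foreign flag); labels: 1 = fraudulent (made up).
history = [(0.1, 0), (0.3, 1), (1.5, 1), (1.2, 0)]
labels = [0, 0, 1, 1]
weights, bias = train_perceptron(history, labels)
```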

How machine learning can identify financial fraud cases

Using voice recognition to detect and provide treatment for dementia

Overview of the SyNAPSE project (5:16)

From programmable computers to cognitive computers
Computers today are just very large, very fast number crunchers and information manipulators. They can process lots of data, but they really don't think. They've all adhered to the von Neumann model, a way of constructing a computer that separates memory from processing and executes a prewritten series of "if X then do Y" instructions.
Cognitive computing bypasses some elements of the von Neumann model by drawing inspiration from the human brain. Humans can do things in parallel: memory and processing are intimately intertwined, and there's no program telling us what to do, all because of the way our brains are configured. In fact, the human brain performs complex tasks rapidly and accurately using about as much energy as a 20-watt light bulb, in a space equivalent to a two-liter soda bottle. It's these functions and capabilities of the human brain that will enable cognitive computing.
IBM scientists are working on a project called SyNAPSE to reproduce the structure and architecture of the brain—the way neurons receive sensory input, connect to each other, adapt these connections, and transmit mental and motor output. The goal is to model computing systems that emulate the brain's computing efficiency, size and power usage without being programmed.
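The actual SyNAPSE hardware uses custom neurosynaptic cores, but the neuron-and-synapse abstraction it builds on can be illustrated with a textbook leaky integrate-and-fire neuron. The sketch below is a generic teaching model in Python, not IBM's design; the threshold, leak and input values are arbitrary.

```python
# A textbook leaky integrate-and-fire neuron, shown only to illustrate the
# neuron/synapse abstraction the SyNAPSE paragraph refers to. This is a
# generic teaching model, not IBM's neurosynaptic core design.

def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Integrate input over time, leak charge each step, and emit a spike (1)
    whenever the membrane potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current  # integrate with leak
        if potential >= threshold:
            spikes.append(1)
            potential = reset                   # fire and reset
        else:
            spikes.append(0)
    return spikes

# Example: a steady small input eventually accumulates enough charge to fire.
print(simulate_lif([0.3] * 10))
```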

Computers that can sense the natural world
Even with the processing power and algorithms to make sense of a large volume of unstructured data, computers need a way to interact with the natural world to consume that raw data. Image recognition and speech recognition give computers the eyes and ears to understand our world. Through computer vision, natural language processing and text mining they process what they see and hear, allowing them to extract meaning and decode human expression.
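As a small illustration of the text-mining step, the sketch below reduces unstructured text to word counts that a program can compare. Real natural language processing goes far deeper (parsing, entities, relations); the point here is only that raw text becomes structured data a machine can reason over, and the document and query are made up.

```python
# Minimal bag-of-words sketch: reducing unstructured text to counts that a
# program can compare. Illustrative only; real NLP pipelines go far beyond
# simple word counts.

from collections import Counter
import re

def bag_of_words(text: str) -> Counter:
    return Counter(re.findall(r"[a-z']+", text.lower()))

def overlap(a: Counter, b: Counter) -> int:
    """Crude similarity: how many word occurrences two texts share."""
    return sum((a & b).values())

doc = "Watson reads medical journals and clinical guidelines."
query = "Which clinical guidelines has Watson read?"
print(overlap(bag_of_words(doc), bag_of_words(query)))
```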

Communicating complexity
Our brains are amazing, but when faced with processing the ever-growing barrage of data surrounding us, our capacity suddenly seems very finite. Cognitive computing can help push those boundaries of human cognition. By using visual analytics and data visualization techniques, cognitive computers can display data in a visually compelling way that enlightens humans and helps them make decisions based on data. The same image recognition and speech recognition that allow a computer to make sense of unstructured data also allow it to interact more seamlessly with humans, providing a feedback loop through which machines and people learn from and teach one another.
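One simple form this takes is surfacing a system's ranked confidence in its candidate answers so a person can weigh the alternatives at a glance. The matplotlib sketch below shows the idea; the answers and scores are invented.

```python
# Minimal matplotlib sketch: showing a system's ranked answer confidences so
# a human expert can see, at a glance, how sure the machine is and where the
# alternatives stand. The answers and scores below are invented.

import matplotlib.pyplot as plt

answers = ["Option A", "Option B", "Option C", "Option D"]
confidence = [0.82, 0.61, 0.34, 0.12]

fig, ax = plt.subplots()
ax.barh(answers, confidence)
ax.set_xlabel("Model confidence")
ax.set_xlim(0, 1)
ax.invert_yaxis()  # highest-confidence answer on top
ax.set_title("Ranked candidate answers")
plt.tight_layout()
plt.show()
```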
