Final Seminar Cognitive Computing
Seminar on
Cognitive Computing
Presented by
Guided by
Dr. Sonali Kulkarni
(Assistant Professor)
Contents:
The term "cognitive"?
What is cognitive computing?
Properties: Adaptive,
Interactive,
Iterative and stateful,
Contextual.
Cognitive computing is a combination of principles of
. Neuroscience,
. Nanotechnology,
. Super Computing.
Content:
Brain Inspired Architecture.
Event-driven non-von Neumann architecture.
Power-efficient cognitive computing architecture.
Technologies integrated in cognitive computing.
IBM Watson.
What do we mean by the term "cognitive"?
* The basic idea behind this type of computing is to develop computer
systems (including hardware and software) that interact with humans the way humans do.
* These computers can recognize, understand, and analyze information, and produce
results at or near the level of the human brain.
Properties:
Adaptive.
These systems must be flexible enough to learn as information changes and as
goals evolve. They must digest dynamic data in real time and adjust as the data and
environment change.
Interactive.
Human-computer interaction is a critical component in cognitive systems. Users
must be able to interact with cognitive machines and define their needs as those
needs change. The technologies must also be able to interact with other processors,
devices and cloud platforms.
Iterative and stateful.
Cognitive computing technologies can ask questions and pull in additional
data to identify or clarify a problem. They must be stateful in that they keep
information about similar situations that have previously occurred.
Contextual.
Understanding context is critical in thought processes. Cognitive systems must
understand, identify and mine contextual data, such as syntax, time, location,
domain, requirements and a user's profile, tasks and goals. The systems may draw
on multiple sources of information, including structured and unstructured data
and visual, auditory and sensor data.
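The "iterative and stateful" behavior described above can be sketched in a few lines of Python. This is purely an illustrative toy (the class name and logic are invented for this example, not a real cognitive API): the system asks for clarification when a problem is new, and reuses stored context when a similar request recurs.

```python
# Toy sketch of the "iterative and stateful" property: an assistant that
# asks clarifying questions and remembers earlier answers for reuse.
# All names here are illustrative, not a real cognitive-computing API.

class CognitiveSession:
    def __init__(self):
        self.history = {}  # state: problem -> clarified answer

    def ask(self, problem, clarification=None):
        if problem in self.history:           # stateful: reuse prior context
            return self.history[problem]
        if clarification is None:             # iterative: pull in more data
            return "Need clarification for: " + problem
        self.history[problem] = clarification
        return clarification

session = CognitiveSession()
print(session.ask("book flight"))                     # asks for clarification
print(session.ask("book flight", "to Pune, Friday"))  # stores the context
print(session.ask("book flight"))                     # answered from state
```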
Cognitive computing refers to the development of
computer systems modeled after the human brain.
Neuroscience.
Super computing.
Nanotechnology.
Neuroscience
* Neuroscience deals with the study of the mind and of neural systems.
* Devices based on this architecture consist of electronic neurons and
synapses and are called neurosynaptic chips.
* To achieve such high performance, cognitive computing needs supercomputing
algorithms and hardware.
* The cognitive computing chip is designed to emulate the neurons and synapses
(connections) in the human brain.
* Individual cores can fail and yet, like the brain, the architecture can still function.
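The electronic neurons on such chips are commonly modeled as leaky integrate-and-fire units. The sketch below is a simplified software simulation of that idea (the parameters are illustrative, not IBM's actual chip design): the neuron accumulates input current, leaks charge over time, and emits a spike whenever a threshold is crossed.

```python
# Simplified leaky integrate-and-fire neuron, the kind of unit
# neurosynaptic cores emulate. Parameters are illustrative only.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # leak, then integrate input
        if potential >= threshold:              # fire when threshold crossed
            spikes.append(t)
            potential = 0.0                     # reset after the spike
    return spikes

# A constant small input accumulates until the neuron fires periodically.
print(simulate_lif([0.4] * 10))  # → [2, 5, 8]
```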
Event-Driven Non-von Neumann Architecture.
• This new neurosynaptic chip is event-driven and operates only when it needs to,
resulting in a cooler operating environment and lower energy use.
• Unlike a von Neumann machine, a non-von Neumann architecture embeds memory
within the processing unit.
• The chip is specially designed for low power consumption, which can clearly
be seen in a thermal image: the cool cognitive chip appears in blue, while a
heated-up traditional chip appears in red.
• This new architecture represents a critical shift away from today's traditional von
Neumann computers toward an extremely power-efficient architecture.
• The goal is to build a chip system with 10 billion neurons and 100 trillion synapses
that consumes just one kilowatt of power.
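The event-driven idea can be illustrated with a small sketch (a hypothetical model, not the chip's actual logic): work is performed only when a spike event arrives in the queue, so a mostly idle time window costs almost nothing, unlike a clock-driven design that executes on every tick.

```python
# Sketch of event-driven operation: cores stay idle until a spike event
# arrives, instead of executing on every clock tick. Hypothetical model,
# not the actual neurosynaptic chip microcode.
from collections import deque

def run_event_driven(events):
    """Process only the queued (time, core_id) spike events."""
    queue = deque(events)
    work_done = 0
    while queue:
        t, core = queue.popleft()
        work_done += 1  # a core wakes up only to handle this one event
    return work_done

# 3 spikes over a 1000-tick window: only 3 units of work, not 1000.
print(run_event_driven([(10, 0), (500, 3), (999, 1)]))  # → 3
```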
Technologies integrated in cognitive computing to mimic the capabilities of the
human brain
•Parallel Computing
•Data Mining
•Machine Learning
•Natural Language Processing
Parallel Computing:
• The human brain does not work sequentially, i.e., it does not perform tasks
one by one; it does everything in parallel.
• So, using parallel computing, cognitive machines adopt parallel
architectures and parallel algorithms.
• Cognitive computing chips contain neurosynaptic cores, each with 256 neurons and
a 256-by-256 array (65,536 in total) of synapses (the connections in the brain).
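As a rough software analogy (assuming Python's standard multiprocessing module, not chip hardware), many neuron evaluations can be dispatched in parallel across worker processes instead of one by one:

```python
# Sketch of parallel evaluation of many "neurons" using Python's
# standard multiprocessing Pool. Illustrative analogy only; real
# neurosynaptic hardware parallelism works very differently.
from multiprocessing import Pool

def neuron_output(weighted_sum):
    # A simple threshold activation, purely for illustration.
    return 1 if weighted_sum > 0.5 else 0

if __name__ == "__main__":
    inputs = [0.2, 0.7, 0.9, 0.1, 0.6]
    with Pool(processes=4) as pool:
        # Each input is evaluated by a worker process in parallel.
        outputs = pool.map(neuron_output, inputs)
    print(outputs)  # → [0, 1, 1, 0, 1]
```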
Data Mining.
• Today we have very large amounts of data, measured in terabytes and petabytes, and
in the future in zettabytes or yottabytes; much of it is noisy and unstructured.
• So, to extract the best possible results or knowledge, data mining is introduced
into cognitive computing.
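A minimal sketch of that data-mining step, assuming plain Python and invented sample documents: pull the most frequent terms out of noisy, unstructured text.

```python
# Illustrative data-mining step (not a specific cognitive-chip API):
# extract the most frequent terms from noisy, unstructured text.
from collections import Counter
import re

def mine_top_terms(documents, k=3):
    words = []
    for doc in documents:
        # Normalize: lowercase, keep alphabetic tokens only.
        words += re.findall(r"[a-z]+", doc.lower())
    return [w for w, _ in Counter(words).most_common(k)]

docs = [
    "Sensor 17 ERROR!! temperature high, temperature rising",
    "temperature normal; sensor ok",
    "ERROR: sensor offline",
]
print(mine_top_terms(docs))  # → ['sensor', 'temperature', 'error']
```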
Machine Learning.
• Machine learning is a subfield of computer science that evolved from the
study of pattern recognition and computational learning theory in artificial
intelligence.
• With this, cognitive systems learn from experience: from their input and output data.
• It is the field of study that gives computers the ability to learn without
being explicitly programmed.
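A tiny illustration of learning from data rather than from explicit rules (a nearest-centroid classifier, invented here as an example, not a specific cognitive-system algorithm): the model is never told what "low" or "high" means; it infers the boundary from labeled examples.

```python
# Sketch of "learning without being explicitly programmed": a tiny
# nearest-centroid classifier built from labeled examples. Illustrative
# only; real cognitive systems use far richer learning methods.

def train(examples):
    """examples: list of (feature_value, label). Returns per-label means."""
    sums, counts = {}, {}
    for x, label in examples:
        sums[label] = sums.get(label, 0.0) + x
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def predict(model, x):
    # Choose the label whose learned mean is closest to x.
    return min(model, key=lambda label: abs(model[label] - x))

model = train([(1.0, "low"), (1.2, "low"), (8.0, "high"), (9.0, "high")])
print(predict(model, 1.5))  # → 'low'
print(predict(model, 7.0))  # → 'high'
```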
Natural Language Processing
• Processing of human-generated language by a computer.
• Natural language processing (NLP) is a field of computer science, artificial intelligence,
and computational linguistics concerned with the interactions between computers and
human (natural) languages.
• These cognitive systems take input in a human-understandable language, process it
with NLP algorithms, and give results in a human-understandable language.
• Cognitive chips use NLP to get data from the internet, which holds a large amount
of natural language data.
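As a minimal sketch (illustrative only, nothing like a real Watson pipeline), an NLP-style system can tokenize an English question, match it against stored knowledge, and reply in English. The FACTS knowledge base below is invented for this example.

```python
# Toy NLP loop: tokenize a natural-language question, match keywords
# against a knowledge base, and answer in natural language.
# FACTS is a hypothetical knowledge base invented for illustration.
import re

FACTS = {
    "watson": "IBM Watson is a cognitive computing system built by IBM.",
    "neuron": "A neuron is the basic processing unit of the brain.",
}

def answer(question):
    # Tokenize and normalize: lowercase, alphabetic tokens only.
    tokens = re.findall(r"[a-z]+", question.lower())
    for token in tokens:
        if token in FACTS:
            return FACTS[token]
    return "Sorry, I do not know about that yet."

print(answer("What is Watson?"))
```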
IBM Watson
• IBM Watson is a cognitive computing system
built by IBM.