Dr. YSR ARCHITECTURE AND FINE ARTS UNIVERSITY
QUANTUM COMPUTING
P. KARTHIK REDDY, DTDP, 21111DT0017, SEMESTER-VIII
Under the guidance of M. PADMAJA, M.Tech., (Ph.D)
Table of Contents
Introduction
History
Classical vs Quantum Computing
Principles of Quantum Computing
Advantages & Disadvantages
Applications
Conclusion
Introduction
What does "quantum" mean?
The word "quantum" in "quantum computer" originates from quantum mechanics, a fundamental theory in physics. In brief, on the scale of atoms and molecules, matter behaves in a quantum manner.
What is a Quantum Computer?
A quantum computer is a machine that performs calculations based on the laws of quantum mechanics, which describe the behavior of particles at the sub-atomic level.
Quantum computing is an advanced technology that uses qubits instead of traditional bits.
It leverages superposition (processing multiple states at once) and entanglement (strong
qubit connections) to solve complex problems faster than classical computers.
It has potential applications in cryptography, AI, drug discovery, and optimization.
History
Conceptual Foundations (1980s): Richard Feynman and Yuri Manin proposed that quantum systems could be simulated using quantum computers.
Key Algorithms Developed (1990s): Peter Shor and Lov Grover developed groundbreaking algorithms showing how quantum computers could outperform classical ones in factoring and searching.
Early Experiments (2000s): Researchers built basic quantum circuits, and D-Wave introduced the first quantum annealer for solving optimization problems.
History
Quantum Processors Emerge (2010s): Companies like IBM and Google developed small-scale quantum processors and began offering access through the cloud.
Quantum Supremacy Achieved (2019): Google announced quantum supremacy by solving a complex problem faster than the world's most powerful supercomputer.
Towards Practical Quantum Computing (2020s & Beyond): Focus shifted to building scalable, error-corrected quantum systems with real-world applications in AI, cryptography, and drug discovery.
Classical vs Quantum Computing
Feature            | Classical Computing                     | Quantum Computing
Basic Unit         | Bit (0 or 1)                            | Qubit (0, 1, or both at the same time)
Working Principle  | Uses binary logic and classical physics | Based on quantum mechanics
Processing Power   | Processes one state at a time           | Can process many states at once (superposition)
Speed              | Slower for complex problems             | Much faster for certain complex problems
Data Storage       | Stores definite values                  | Stores multiple possibilities in one qubit
Examples of Use    | Web browsing, word processing, games    | Cryptography, AI, drug discovery, complex simulations
Stability          | Stable and reliable                     | Prone to errors, still in early development
Principles of Quantum Computing
Quantum computing is based on the principles of quantum mechanics.
Unlike classical computers that use bits (0s and 1s), quantum computers use qubits, which can exist in multiple states simultaneously.
The core principles of quantum computing are:
Qubits
Superposition
Quantum Entanglement
Quantum Interference
Quantum Parallelism
Quantum Measurement & Wavefunction Collapse
Principles of Quantum Computing
Qubits:
Classical computers use binary bits (0 or 1).
Qubits, however, can be in a superposition of both 0 and 1 simultaneously.
Qubits are implemented using quantum particles like electrons, photons, or
superconducting circuits.
Superposition:
A qubit can exist in multiple states at the same time (both 0 and 1).
This allows quantum computers to process vast amounts of data in parallel.
Example: If a classical bit is like a coin that lands either on heads (0) or tails (1), a qubit
is like a spinning coin that can be both at once until observed.
Mathematically: A qubit state is written as:
∣𝜓⟩=𝛼∣0⟩+𝛽∣1⟩
where α and β are complex probability amplitudes satisfying |α|² + |β|² = 1.
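The amplitude form above can be sketched with plain NumPy, as a classical simulation rather than real quantum hardware; the particular values of α and β below are illustrative, not from the slides:

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1> as a 2-component complex vector.
# Any complex pair with |alpha|^2 + |beta|^2 = 1 is a valid qubit state.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = np.array([alpha, beta], dtype=complex)

# The Born rule: measurement probabilities are squared amplitude magnitudes.
p0 = abs(psi[0]) ** 2   # probability of measuring 0
p1 = abs(psi[1]) ** 2   # probability of measuring 1
print(p0, p1)           # the two probabilities sum to 1
```

Note that β here is purely imaginary: amplitudes carry phase information that probabilities alone do not, which is what makes interference (below) possible.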
Principles of Quantum Computing
Quantum Entanglement:
When two qubits become entangled, their states become linked, regardless of distance.
Measuring one qubit instantly determines the measurement outcome of the other.
This is used for secure communication (Quantum Cryptography) and speeding up
computations.
Example: for a suitably entangled pair of qubits, if one is measured as 0, the other will immediately be found to be 1, even if they are light-years apart.
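The anti-correlated pair in the example corresponds to the entangled state (|01⟩ + |10⟩)/√2, which can be sampled in a short NumPy sketch (a classical simulation for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Entangled state (|01> + |10>)/sqrt(2), amplitudes ordered |00>,|01>,|10>,|11>.
state = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)
probs = np.abs(state) ** 2   # joint measurement probabilities

# Sample 1000 joint measurements: the two qubits always disagree.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
print(set(outcomes))  # only "01" and "10" ever occur
```

The key point the simulation shows is correlation, not signaling: each qubit alone looks like a fair coin, yet the joint outcomes are perfectly anti-correlated.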
Quantum Interference:
Quantum states interfere with each other, affecting the probability of different
outcomes.
Quantum algorithms amplify correct results while canceling incorrect ones.
It is used in Shor’s Algorithm (for factorizing large numbers) and Grover’s Algorithm (for
searching databases faster than classical computers).
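Interference can be shown with the smallest possible example, assuming the standard Hadamard gate H (not mentioned on the slides): applying H twice returns |0⟩ to itself because the two paths to |1⟩ cancel.

```python
import numpy as np

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

plus = H @ ket0   # (|0> + |1>)/sqrt(2): equal superposition
back = H @ plus   # the |1> amplitudes cancel (destructive interference)
print(np.round(back, 10))  # the state is |0> again
```

This cancellation of wrong answers and reinforcement of right ones, scaled up, is exactly the mechanism Shor's and Grover's algorithms exploit.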
Principles of Quantum Computing
Quantum Parallelism:
Due to superposition, quantum computers can perform multiple calculations at once.
This enables exponential speedup for certain problems compared to classical computers.
Example: A 100-qubit quantum computer can represent 2¹⁰⁰ (about 1.3 × 10³⁰) states simultaneously, far more than any classical memory can store explicitly.
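The exponential growth is easy to see in simulation: applying a Hadamard gate (an assumption here, not named on the slides) to each of n qubits puts the register into a uniform superposition over all 2ⁿ basis states.

```python
import numpy as np

# Hadamard gate: |0> -> (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Build a 10-qubit state one qubit at a time via the Kronecker product.
n = 10
state = np.array([1.0])
for _ in range(n):
    state = np.kron(state, H @ np.array([1.0, 0.0]))

print(state.size)  # 2**10 = 1024 amplitudes for just 10 qubits
print(np.allclose(state, 1 / np.sqrt(2 ** n)))  # all states equally weighted
```

The classical simulator must track every one of those 2ⁿ amplitudes, which is why simulating even ~50 qubits exhausts classical supercomputers while the quantum device holds the state natively.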
Quantum Measurement & Wavefunction collapse:
When a qubit is measured, it collapses into either 0 or 1 with certain probabilities.
Measurement destroys superposition, meaning quantum computations must be designed
carefully to extract useful information before measurement.
Example:
Before measurement: |ψ⟩ = (1/√2)(|0⟩ + |1⟩)
After measurement: either |0⟩ (50%) or |1⟩ (50%)
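Repeated measurement of that state can be sketched with NumPy (a classical simulation; each "shot" collapses the superposition to a definite bit):

```python
import numpy as np

rng = np.random.default_rng(42)

# |psi> = (|0> + |1>)/sqrt(2): each measurement gives 0 or 1 with 50% each.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(psi) ** 2

# 10,000 independent shots; the superposition is gone after each one.
samples = rng.choice([0, 1], size=10_000, p=probs)
freq0 = np.mean(samples == 0)
print(freq0)  # empirical frequency of 0, close to 0.5
```

This is why quantum algorithms arrange interference before measuring: a single readout yields only one classical bit string, so the amplitudes must already be concentrated on useful answers.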
Advantages & Disadvantages
Advantages:
Speed & Parallelism
Solving Complex Problems
Enhanced Cryptography
Security
Innovation
Simulation of Quantum Systems

Disadvantages:
Complexity
Hardware Challenges
Decoherence
High Cost
Instability
Threat to Classical Encryption
Applications
Cryptography
Drug Discovery
Machine Learning
Financial Modeling
Climate Modeling
Aerospace & Defense
Material Science
Secure Communication
Energy Grid Management
Genome Analysis
Conclusion
Quantum computing uses the principles of quantum mechanics to process
information in powerful new ways.
It offers massive speed and efficiency for solving complex problems that classical
computers struggle with.
Applications include cryptography, drug discovery, machine learning, optimization,
and more.
Quantum technology is still developing and faces challenges like qubit stability and
error correction.
In the future, quantum computers are expected to become faster, more reliable, and
widely accessible.
Thank You