Asymptotic Analysis
Main idea:
Focus on how the runtime scales with n (the input size).
(Heuristically: only pay attention to the largest function of n that appears.)
Some examples…
[Table: Number of operations → Asymptotic Running Time]
Why is this a good idea?
• Suppose the running time of an algorithm is:
  10·n² + (lower-order terms) ms
• This constant factor of 10 depends a lot on my computing platform…
• These lower-order terms don't really matter as n gets large.
• We're just left with the n² term! That's what's meaningful.
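The heuristic can be checked numerically. A minimal sketch, assuming a hypothetical runtime with the constant factor of 10 from above and made-up lower-order terms (the 50·n + 1000 is an assumption, not from the slides):

```python
# Hypothetical runtime: constant factor 10 (platform-dependent)
# plus made-up lower-order terms (an assumption for illustration).
def runtime_ms(n):
    return 10 * n**2 + 50 * n + 1000

# Fraction of the total contributed by the 10*n^2 term approaches 1
# as n grows, so only the n^2 term is meaningful for large n.
for n in (10, 100, 1000, 10_000):
    print(n, 10 * n**2 / runtime_ms(n))
```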
Pros and Cons of Asymptotic Analysis
Pros:
• Abstracts away from hardware- and language-specific issues.
• Makes algorithm analysis much more tractable.
• Allows us to meaningfully compare how algorithms will perform on large inputs.
Cons:
• Only makes sense if n is large (compared to the constant factors). 1000000000·n is "better" than n²?!?!
Informal definition for O(g(n))
(pronounced "big-oh of g(n)" or sometimes "oh of g(n)")
• A function grows no faster than a certain rate.
Formal definition for O(g(n))
• For a given function g(n) of n,
• O(g(n)) is the set of functions f(n) such that:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
Formal definition for
• Let , be functions of positive integers.
• Think of as a runtime: positive and increasing in n.
• We say “ is ” if:
for all large enough n,
is at most some constant multiple of .
Here, “constant” means “some number
that doesn’t depend on n.”
Example: for large enough n, T(n) is at most some constant multiple of g(n).
[Plot: T(n) = 2n² + 10 and g(n) = n²; then 3g(n) = 3n², which stays at or above T(n) from n0 = 4 onward.]
Formal definition of O(g(n))
• Let T(n), g(n) be functions of positive integers.
• Think of T(n) as a runtime: positive and increasing in n.
• Formally: T(n) = O(g(n)) ⟺ ∃ c, n0 > 0 s.t. ∀ n ≥ n0, T(n) ≤ c·g(n)
• (Here "⟺" reads "if and only if", "∀" reads "for all", "∃" reads "there exists", and "s.t." reads "such that".)
Example
[Plot: T(n) = 2n² + 10, g(n) = n², and 3g(n) = 3n², which crosses above T(n) at n0 = 4.]
Formally:
• Choose c = 3
• Choose n0 = 4
• Then: 0 ≤ 2n² + 10 ≤ 3·n² for all n ≥ 4, so T(n) = O(n²).
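The witnesses c = 3, n0 = 4 can be spot-checked numerically over a finite range (a sanity check, not a proof for all n; the cutoff 10,000 is arbitrary):

```python
# T(n) = 2n^2 + 10 and g(n) = n^2, with witnesses c = 3, n0 = 4.
def T(n):
    return 2 * n**2 + 10

def g(n):
    return n**2

c, n0 = 3, 4
# The defining inequality holds at every n we check at or beyond n0...
assert all(0 <= T(n) <= c * g(n) for n in range(n0, 10_000))
# ...but fails just below n0: T(3) = 28 > 27 = 3 * g(3).
assert T(3) > c * g(3)
```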
Same example
[Plot: T(n) = 2n² + 10, g(n) = n², and 7g(n) = 7n², which crosses above T(n) at n0 = 2.]
Formally:
• Choose c = 7
• Choose n0 = 2
• Then: 0 ≤ 2n² + 10 ≤ 7·n² for all n ≥ 2, so T(n) = O(n²).
• The witnesses c and n0 are not unique!
O(g(n)) is an upper bound
[Plot: T(n) = n and g(n) = n².]
• Choose c = 1
• Choose n0 = 1
• Then: n ≤ n² for all n ≥ 1, so T(n) = n is O(n²).
Informal definition for Ω(g(n))
• A function grows at least as fast as a certain rate.
Formal definition for Ω(g(n))
• For a given function g(n) of n,
• Ω(g(n)) is the set of functions f(n) such that:
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
Ω(g(n)) means a lower bound
• We say "T(n) is Ω(g(n))" if, for large enough n, T(n) is at least as big as a constant multiple of g(n).
• Formally: T(n) = Ω(g(n)) ⟺ ∃ c, n0 > 0 s.t. ∀ n ≥ n0, c·g(n) ≤ T(n)
• (Compared with the definition of O, the two sides of the inequality are switched!)
Example
[Plot: T(n) = n·log(n), g(n) = 3n, and g(n)/3 = n.]
• Choose c = 1/3
• Choose n0 = 2
• Then: c·g(n) = n ≤ n·log(n) = T(n) for all n ≥ 2, so T(n) = Ω(g(n)).
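The same kind of finite spot-check works for the Ω witnesses c = 1/3, n0 = 2 (assuming log means log base 2, as is common in algorithm analysis):

```python
import math

# T(n) = n*log2(n) and g(n) = 3n, with witnesses c = 1/3, n0 = 2.
def T(n):
    return n * math.log2(n)

def g(n):
    return 3 * n

c, n0 = 1 / 3, 2
# c*g(n) = n <= n*log2(n) = T(n) whenever log2(n) >= 1, i.e. n >= 2.
assert all(c * g(n) <= T(n) for n in range(n0, 10_000))
```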
Informal definition for Θ(g(n))
• A function grows precisely at a certain rate.
Θ(g(n)) means both!
• We say "T(n) is Θ(g(n))" iff both: T(n) = O(g(n)) and T(n) = Ω(g(n)).
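Both directions can be spot-checked for the running example T(n) = 2n² + 10; the upper-bound witnesses are from the O example, while the lower-bound witness c = 2 is my choice, not from the slides:

```python
# T(n) = 2n^2 + 10 is Theta(n^2): check witnesses for O and for Omega.
def T(n):
    return 2 * n**2 + 10

def g(n):
    return n**2

# Upper bound (c = 3, n0 = 4, as in the O example).
assert all(T(n) <= 3 * g(n) for n in range(4, 10_000))
# Lower bound (c = 2, n0 = 1): 2*n^2 <= 2*n^2 + 10 for every n.
assert all(2 * g(n) <= T(n) for n in range(1, 10_000))
```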
Summary of Asymptotic Notations
• O(g(n)): an upper bound; T(n) grows no faster than a constant multiple of g(n).
• Ω(g(n)): a lower bound; T(n) grows at least as fast as a constant multiple of g(n).
• Θ(g(n)): both; T(n) grows at the same rate as g(n), up to constant multiples.
Non-Example: n² is not O(n)
• Proof by contradiction:
• Suppose that n² = O(n).
• Then there is some positive c and n0 so that: n² ≤ c·n for all n ≥ n0.
• Divide both sides by n: n ≤ c for all n ≥ n0.
• That's not true!!! What about n = n0 + ⌈c⌉?
• Then n ≥ n0, but n > c.
• Contradiction!
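The contradiction can be mirrored in code: whatever witnesses (c, n0) someone claims for "n² = O(n)", there is always an n beyond both where n² > c·n. A small sketch:

```python
import math

# Given claimed witnesses (c, n0) for "n^2 = O(n)", exhibit a
# counterexample: an n with n >= n0 but n^2 > c*n.
def counterexample(c, n0):
    n = max(n0, math.ceil(c) + 1)  # ensures n >= n0 and n > c
    assert n >= n0 and n**2 > c * n
    return n

# No matter what witnesses are claimed, the inequality breaks.
for c, n0 in [(1, 1), (100, 5), (10**6, 10**3)]:
    counterexample(c, n0)
```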
Take-away from examples
• To prove T(n) = O(g(n)), you have to come up with c and n0 so that the definition is satisfied.
• To prove T(n) is NOT O(g(n)), one way is proof by contradiction:
  • Suppose (to get a contradiction) that someone gives you a c and an n0 so that the definition is satisfied.
  • Show that this someone must be lying to you by deriving a contradiction.
Formal definition of o(g(n))
• For a given function g(n) of n,
• o(g(n)) is the set of functions f(n) such that:
o(g(n)) = { f(n) : for any positive constant c, there exists a positive constant n0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }
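An equivalent intuition (when g(n) > 0): f(n) = o(g(n)) exactly when the ratio f(n)/g(n) tends to 0. A quick numeric illustration with f(n) = n and g(n) = n²:

```python
# f(n) = n is o(n^2): the ratio f(n)/g(n) = 1/n shrinks toward 0.
ratios = [n / n**2 for n in (10, 100, 1000, 10_000)]
assert ratios == sorted(ratios, reverse=True)  # strictly decreasing
assert ratios[-1] < 1e-3

# Contrast: f(n) = 2n^2 is O(n^2) but NOT o(n^2); its ratio stays at 2.
assert all(2 * n**2 / n**2 == 2 for n in (10, 100, 1000))
```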
Formal definition of ω(g(n))
• For a given function g(n) of n,
• ω(g(n)) is the set of functions f(n) such that:
ω(g(n)) = { f(n) : for any positive constant c, there exists a positive constant n0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }
Common Bounds
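A sketch comparing the growth of some commonly seen bounds (this particular list of functions is my assumption; it is not taken from the slide):

```python
import math

# Commonly seen bounds, slowest- to fastest-growing (assumed list).
bounds = [
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: float(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: float(n**2)),
    ("2^n",     lambda n: float(2**n)),
]

n = 20
values = [f(n) for _, f in bounds]
assert values == sorted(values)  # the ordering is already visible at n = 20
for (name, _), v in zip(bounds, values):
    print(f"{name:8} {v:,.0f}")
```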
Some Notations
• Asymptotic notations are defined as sets.
• But we write f(n) = O(g(n)) instead of f(n) ∈ O(g(n)).
• We can also write asymptotic notation inside expressions, e.g. 2n² + O(n).
• Provide the simplest and most precise bounds possible.
Reference
• CLRS Chapter 3