
Algorithm Complexity Analysis

Chapter 22

Liang, Introduction to Java Programming, Tenth Edition, (c) 2013 Pearson Education, Inc. All
rights reserved.
Objectives
To estimate algorithm efficiency using the Big O notation.
To explain growth rates and why constants and non-dominating
terms can be ignored in estimating algorithm efficiency.
To determine the complexity of various types of algorithms:
– binary search algorithm
– selection sort algorithm
– insertion sort algorithm
– Towers of Hanoi algorithm
To describe common growth functions (constant, logarithmic, log-
linear, quadratic, cubic, exponential).
To compare different algorithms for the following problems:
– finding Fibonacci numbers
– finding the greatest common divisor (GCD).

Algorithm Efficiency
Suppose two algorithms perform the same task, such as search
(linear search vs. binary search) or sorting (selection sort vs.
insertion sort).
– Which one is better (i.e., which one uses less memory and takes less time)?
One possible approach to answer this question is to implement these
algorithms and run the programs to measure execution time and memory
usage. But there are two problems with this approach:

1. There are many tasks running concurrently on a computer, so the execution
time/memory usage of a particular program depends on the system load.
2. The execution time/memory usage depends on the specific input.

Consider linear search and binary search, for example.
– If the element to be searched for happens to be the first in the list, linear search
will find it quicker than binary search, but
– if the element is not in the list, binary search would be faster.
Growth Rate
It is very difficult to compare algorithms by measuring
their execution time/memory usage.
To overcome these problems, a theoretical approach
was developed to analyze algorithms independent of
computers and specific input.
This approach approximates the effect of a change in the
input size on resource usage.
In this way, you can see how fast an algorithm’s
execution time/ memory usage increases as the input
size increases, so you can compare two algorithms by
examining their growth rates.

Big O Notation
Consider linear search:
The linear search algorithm compares the key with the
elements in the array sequentially until the key is found or the
array is exhausted.
– If the key is not in the array, it requires n comparisons for an array of
size n.
– If the key is in the array, it requires n/2 comparisons on average.
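The linear search described above can be sketched in Java as follows (an illustrative version; the class and method names are our own, not a listing from the textbook):

```java
public class LinearSearchDemo {
    // Compare the key with each element in turn until the key is found
    // or the array is exhausted: at most n comparisons for n elements.
    public static int linearSearch(int[] list, int key) {
        for (int i = 0; i < list.length; i++) {
            if (list[i] == key)
                return i;   // found: return the index
        }
        return -1;          // not found: all n comparisons were made
    }
}
```

In the worst case (key absent, or last) the loop runs n times, which is where the O(n) bound comes from.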

Big O Notation….
The algorithm’s execution time is proportional to the size of
the array.
– If you double the size of the array, you will expect the number of
comparisons to double.
The algorithm grows at a linear rate. The growth rate has an
order of magnitude of n.
Computer scientists use the Big O notation to abbreviate
“order of magnitude.” Using this notation, the complexity of
the linear search algorithm is O(n), pronounced as “order of
n.”

Ignoring Multiplicative Constants
The linear search algorithm requires n comparisons in the worst-case and
n/2 comparisons in the average-case. Using the Big O notation, both
cases require O(n) time.
The multiplicative constant (1/2) can be omitted. Algorithm analysis is
focused on growth rate, and multiplicative constants have no impact on
growth rates.
– The growth rate for n/2 or 100n is the same as for n, i.e., O(n/2) = O(100n) = O(n).

                 f(n) = n    f(n) = n/2    f(n) = 100n
n = 100          100         50            10000
n = 200          200         100           20000
f(200)/f(100)    2           2             2

Ignoring Non-Dominating Terms
Consider the algorithm for finding the maximum number in
an array of n elements.
If n is 2, it takes one comparison to find the maximum
number.
If n is 3, it takes two comparisons to find the maximum
number.
In general, it takes n−1 comparisons to find the maximum
number in a list of n elements.
As n grows larger, the n part in the expression, n-1,
dominates the complexity function. The Big O notation
allows you to ignore the non-dominating part (e.g., -1 in the
expression n-1) and highlight the important part (e.g., n in the
expression n-1). So, the complexity of this algorithm is O(n).
In mathematics this is known as asymptotic analysis.
Asymptotic Analysis
To illustrate complexity analysis, consider:
f(n) = n^2 + 100n + log10 n + 1000
As the value of n increases, the importance of each term
shifts until for large n, only the n2 term is significant

Best, Worst, and Average Cases
For the same input size, an algorithm’s execution time may vary,
depending on the input.
– An input that results in the shortest execution time is called the best-case
input,
– an input that results in the longest execution time is called the worst-case
input.
– Best-case and worst-case are not representative, but worst-case analysis is
very useful. You can show that the algorithm will never be worse than the
worst-case.
– An average-case analysis attempts to determine the average amount of time
among all possible inputs of the same size. Average-case analysis is ideal, but
difficult to perform, because for many problems it is hard to determine the
relative probabilities and distributions of the various input instances.
– Worst-case analysis is easier to obtain and is thus common. So, the analysis
is generally conducted for the worst-case.

Big-O Notation: Formal Definition
The function f(n) = n^2 + 100n + log10 n + 1000 = O(n^2) (read
"big-oh of n squared").

Definition: Let f(n) and g(n) be functions defined on the positive
integers. We write f(n) = O(g(n)) if and only if there exist a real
number c > 0 and a positive integer N satisfying
0 ≤ f(n) ≤ c·g(n) for all n > N. (And we say, "f of n is big-oh of g of
n.")

This means that functions like:
– n^2 + n,
– 4n^2 − n log n + 12, and
– n^2/5 − 100n
are all O(n^2).

Useful Mathematic Summations

1 + 2 + 3 + ... + (n−1) + n = n(n+1)/2

Σ (a + kd), for k = 0 to n−1, = (n/2)(2a + (n−1)d)

a^0 + a^1 + a^2 + a^3 + ... + a^(n−1) + a^n = (a^(n+1) − 1)/(a − 1)

2^0 + 2^1 + 2^2 + 2^3 + ... + 2^(n−1) + 2^n = 2^(n+1) − 1
Examples: Determining Big-O

➢ Repetition
➢ Sequence
➢ Selection
➢ Logarithmic time complexity

Primitive Counting
int findMin(int[] array) {
    int min = array[0];
    for (int i = 1; i < array.length; i++)
        if (array[i] < min)
            min = array[i];
    return min;
}

Repetition: Simple Loops
for (int i = 1; i <= n; i++) {   // executed n times
    k = k + 5;                   // constant time
}

Time Complexity
T(n) = (a constant c) * n = cn = O(n)

Ignore multiplicative constants (e.g., “c”).

Repetition: Nested Loops
for (int i = 1; i <= n; i++) {       // outer loop executed n times
    for (int j = 1; j <= n; j++) {   // inner loop executed n times
        k = k + i + j;               // constant time
    }
}

Time Complexity
T(n) = (a constant c) * n * n = cn^2 = O(n^2)

Ignore multiplicative constants (e.g., “c”).

Repetition: Nested Loops
for (int i = 1; i <= n; i++) {       // outer loop executed n times
    for (int j = 1; j <= i; j++) {   // inner loop executed i times
        k = k + i + j;               // constant time
    }
}
Time Complexity
T(n) = c + 2c + 3c + 4c + … + nc = cn(n+1)/2 =
(c/2)n^2 + (c/2)n = O(n^2)
Ignore non-dominating terms

Ignore multiplicative constants


Repetition: Nested Loops
for (int i = 1; i <= n; i++) {        // outer loop executed n times
    for (int j = 1; j <= 20; j++) {   // inner loop executed 20 times
        k = k + i + j;                // constant time
    }
}

Time Complexity
T(n) = 20 * c * n = O(n)

Ignore multiplicative constants (e.g., 20*c)

Sequence
for (int j = 1; j <= 10; j++) {      // first loop executed 10 times
    k = k + 4;
}

for (int i = 1; i <= n; i++) {       // outer loop executed n times
    for (int j = 1; j <= 20; j++) {  // inner loop executed 20 times
        k = k + i + j;
    }
}

Time Complexity
T(n) = c *10 + 20 * c * n = O(n)

Selection
if (list.contains(e)) {          // the test list.contains(e) is O(n)
    System.out.println(e);
}
else {
    for (Object t : list) {      // let n be list.size();
        System.out.println(t);   // loop executed n times
    }
}

Time Complexity
T(n) = test time + worst-case (if, else)
= O(n) + O(n)
= O(n)

Constant Time
➢ The Big O notation estimates the execution time of an algorithm in
relation to the input size.
➢ If the time is not related to the input size, the algorithm is said to
take constant time with the notation O(1).
➢ For example, a method that retrieves an element at a given index in
an array takes constant time, because the access time does not grow as
the size of the array increases.

Linear Search Animation
https://www.cs.usfca.edu/~galles/visualization/Search.html

Binary Search Animation
https://www.cs.usfca.edu/~galles/visualization/Search.html

Logarithmic Complexity: Analyzing
Binary Search
➢ The binary search searches a key in a sorted array.
➢ Each iteration in the algorithm contains a fixed number of
operations, denoted by c.
➢ Let T(n) denote the time complexity for a binary search on a list of
n elements.
➢ Since binary search eliminates half of the input after two
comparisons,
T(n) = T(n/2) + c
     = T(n/2^2) + c + c
     = ...
     = T(n/2^k) + kc        (with n = 2^k, so k = log n)
     = T(1) + c log n
     = c + c log n = O(log n)
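The iterative binary search this recurrence describes can be sketched as follows (our own illustrative version, not the textbook's listing; it assumes the array is sorted in increasing order):

```java
public class BinarySearchDemo {
    // Each iteration does a constant amount of work and halves the
    // remaining search range, giving T(n) = T(n/2) + c.
    public static int binarySearch(int[] list, int key) {
        int low = 0;
        int high = list.length - 1;
        while (low <= high) {
            int mid = (low + high) / 2;
            if (key < list[mid])
                high = mid - 1;      // discard the upper half
            else if (key > list[mid])
                low = mid + 1;       // discard the lower half
            else
                return mid;          // key found
        }
        return -1;                   // key not in the sorted list
    }
}
```

After k halvings the range has size n/2^k, so at most about log n iterations are needed.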

Logarithmic Time
➢ Ignoring constants and smaller terms, the complexity of the
binary search algorithm is O(log(n)).
➢ An algorithm with the O(log(n)) time complexity is called a
logarithmic algorithm.
➢ The base of the log is 2, but the base does not affect a
logarithmic growth rate, so it can be omitted.
➢ The logarithmic algorithm grows slowly as the problem size
increases.
– If you square the input size, you only double the time for the
algorithm.
Selection Sort Animation
https://www.cs.usfca.edu/~galles/visualization/Compariso
nSort.html

Analyzing Selection Sort
➢ The selection sort algorithm finds the smallest number in the list
and places it at index 0.
➢ It then finds the smallest number remaining and places it at
index 1, and so on, until the list contains only a single number.
➢ The number of comparisons is n−1 for the first iteration, n−2 for the
second iteration, and so on.
➢ Let T(n) denote the complexity of selection sort, d denote the
constant time for each comparison, and c denote the total of the
other operations (such as assignments and additional comparisons)
in each iteration. So,
T(n) = (n−1)d + c + (n−2)d + c + ... + 2d + c + 1d + c
     = ((n−1) + (n−2) + ... + 2 + 1)d + (n−1)c
     = (n^2/2)d − (n/2)d + cn − c
➢ Ignoring constants and smaller terms, the complexity of the
selection sort algorithm is O(n^2).
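The selection sort just analyzed can be sketched as follows (an illustrative version, not the textbook's listing):

```java
public class SelectionSortDemo {
    // Pass i scans the unsorted tail (n-1-i comparisons) for the smallest
    // value and swaps it into position i, matching the
    // (n-1) + (n-2) + ... + 1 comparison count in the analysis above.
    public static void selectionSort(int[] list) {
        for (int i = 0; i < list.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < list.length; j++) {
                if (list[j] < list[minIndex])
                    minIndex = j;
            }
            int temp = list[i];          // swap the smallest remaining
            list[i] = list[minIndex];    // value into position i
            list[minIndex] = temp;
        }
    }
}
```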

Quadratic Time
➢ An algorithm with the O(n^2) time complexity is called a
quadratic algorithm.
➢ The quadratic algorithm grows quickly as the problem
size increases.
➢ If you double the input size, the time for the algorithm is
quadrupled.
➢ Algorithms with a nested loop are often quadratic.

Insertion Sort Animation
https://www.cs.usfca.edu/~galles/visualization/Compariso
nSort.html

Analyzing Insertion Sort
➢ The insertion sort algorithm sorts a list of values by repeatedly
inserting a new element into a sorted partial array until the whole
array is sorted.
➢ At the kth iteration, inserting an element into a sorted subarray of
size k may take k comparisons to find the insertion position and k
moves to insert the element.
➢ Let T(n) denote the complexity of insertion sort, d denote the
constant time for each comparison or data movement, and c denote
the total of the other operations (such as assignments and additional
comparisons) in each iteration. So,

T(n) = 2·1·d + c + 2·2·d + c + ... + 2·(n−1)·d + c
     = 2d(1 + 2 + ... + (n−1)) + (n−1)c
     = n^2·d − nd + cn − c
➢ Ignoring constants and smaller terms, the complexity of the
insertion sort algorithm is O(n^2).
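The insertion sort just analyzed can be sketched as follows (an illustrative version, not the textbook's listing):

```java
public class InsertionSortDemo {
    // At iteration i, list[0..i-1] is already sorted; list[i] is compared
    // and shifted left until its position is found, up to i comparisons
    // and i moves: the 2·i·d term in the analysis above.
    public static void insertionSort(int[] list) {
        for (int i = 1; i < list.length; i++) {
            int currentElement = list[i];
            int k = i - 1;
            while (k >= 0 && list[k] > currentElement) {
                list[k + 1] = list[k];    // move list[k] one slot right
                k--;
            }
            list[k + 1] = currentElement; // insert into the open slot
        }
    }
}
```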

Towers of Hanoi
http://towersofhanoi.info/Animate.aspx

➢ The Towers of Hanoi problem moves n disks from tower A to tower B
with the assistance of tower C, recursively.

Analyzing Towers of Hanoi
➢ The Towers of Hanoi algorithm moves n disks from tower A to tower B
with the assistance of tower C recursively as follows:
– Move the first n – 1 disks from A to C with the assistance of tower B.
– Move disk n from A to B.
– Move n - 1 disks from C to B with the assistance of tower A.
➢ Let T(n) denote the complexity for the algorithm that moves
disks and c denote the constant time to move one disk, i.e., T(1)
is c. So,
T(n) = T(n−1) + c + T(n−1) = 2T(n−1) + c
     = 2(2T(n−2) + c) + c = 2^2 T(n−2) + 2c + c
     ...
     = 2^(n−1) T(1) + 2^(n−2) c + ... + 2c + c
     = 2^(n−1) c + 2^(n−2) c + ... + 2c + c = (2^n − 1)c = O(2^n)
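The recurrence can be checked numerically by counting moves rather than performing them (an illustrative sketch; the class and method names are our own):

```java
public class HanoiDemo {
    // T(n) = T(n-1) + 1 + T(n-1): move n-1 disks aside, move disk n,
    // then move the n-1 disks back on top. Total is 2^n - 1 moves.
    public static long countMoves(int n) {
        if (n == 1)
            return 1;              // base case: one disk, one move
        return countMoves(n - 1) + 1 + countMoves(n - 1);
    }
}
```

For example, 10 disks require 2^10 − 1 = 1023 moves, in line with the closed form above.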
Case Study: Fibonacci Numbers
/** The method for finding the Fibonacci number */
public static long fib(long index) {
if (index == 0) // Base case
return 0;
else if (index == 1) // Base case
return 1;
else // Reduction and recursive calls
return fib(index - 1) + fib(index - 2);
}
Fibonacci series: 0 1 1 2 3 5 8 13 21 34 55 89…
indices: 0 1 2 3 4 5 6 7 8 9 10 11

fib(0) = 0;
fib(1) = 1;
fib(index) = fib(index -1) + fib(index -2); index >=2
Complexity for Recursive
Fibonacci Numbers
Since T(n) = T(n−1) + T(n−2) + c,

T(n) ≤ 2T(n−1) + c
     ≤ 2(2T(n−2) + c) + c = 2^2 T(n−2) + 2c + c
     ...
     ≤ 2^(n−1) T(1) + (2^(n−2) + ... + 2 + 1)c
     = 2^(n−1) T(1) + (2^(n−1) − 1)c
     = 2^(n−1) c + (2^(n−1) − 1)c = O(2^n)

Similarly, since T(n−1) ≥ T(n−2),

T(n) ≥ 2T(n−2) + c
     ≥ 2^2 T(n − 2·2) + 2c + c
     ≥ 2^3 T(n − 2·3) + 2^2 c + 2c + c
     ...
     ≥ 2^(n/2) T(0) + (2^(n/2 − 1) + ... + 2 + 1)c,

so the recursive algorithm's running time is genuinely exponential.

Case Study: Non-recursive version
of Fibonacci Numbers
public static long fib(long n) {
    long f0 = 0; // For fib(0)
    long f1 = 1; // For fib(1)
    long f2 = 1; // For fib(2)

    if (n == 0)
        return f0;
    else if (n == 1)
        return f1;
    else if (n == 2)
        return f2;

    for (int i = 3; i <= n; i++) {
        f0 = f1;
        f1 = f2;
        f2 = f0 + f1;
    }

    return f2;
}

Obviously, the complexity of this new algorithm is O(n).
This is a tremendous improvement over the recursive algorithm.
f0 f1 f2
Fibonacci series: 0 1 1 2 3 5 8 13 21 34 55 89…
indices: 0 1 2 3 4 5 6 7 8 9 10 11

Dynamic Programming
➢ The algorithm for computing Fibonacci numbers presented in
the previous 2 slides uses an approach known as dynamic
programming.
➢ The most intuitive way to solve the Fibonacci sequence
problem is to solve sub-problems, then combine the solutions
of sub-problems to obtain an overall solution.
– This naturally leads to a recursive solution. However, it would be
inefficient to use recursion, because the sub-problems overlap.
➢ The key idea behind dynamic programming is to solve each
subproblem only once and to store the results of sub-problems
for later use, avoiding redundant computation of the sub-
problems.
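The same idea applied top-down, caching each computed Fibonacci value, can be sketched as follows (an illustrative memoized version; the class name and cache are our own, not from the textbook):

```java
import java.util.HashMap;
import java.util.Map;

public class FibMemoDemo {
    private static final Map<Long, Long> cache = new HashMap<>();

    // Each fib(k) is computed once and cached; later calls hit the
    // cache, so the overlapping subproblems cost O(n) total work
    // instead of the O(2^n) of the plain recursive version.
    public static long fib(long index) {
        if (index <= 1)
            return index;          // base cases: fib(0) = 0, fib(1) = 1
        Long cached = cache.get(index);
        if (cached != null)
            return cached;
        long value = fib(index - 1) + fib(index - 2);
        cache.put(index, value);
        return value;
    }
}
```

fib(50) returns promptly with memoization; the plain recursive version would make billions of calls for the same input.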

Common Recurrence Relations
Recurrence Relation                Result                Example

T(n) = T(n/2) + O(1)               T(n) = O(log n)       Binary search, Euclid's GCD
T(n) = T(n−1) + O(1)               T(n) = O(n)           Linear search
T(n) = 2T(n/2) + O(1)              T(n) = O(n)
T(n) = 2T(n/2) + O(n)              T(n) = O(n log n)     Merge sort (Chapter 24)
T(n) = 2T(n/2) + O(n log n)        T(n) = O(n log^2 n)
T(n) = T(n−1) + O(n)               T(n) = O(n^2)         Selection sort, insertion sort
T(n) = 2T(n−1) + O(1)              T(n) = O(2^n)         Towers of Hanoi
T(n) = T(n−1) + T(n−2) + O(1)      T(n) = O(2^n)         Recursive Fibonacci algorithm
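Euclid's GCD algorithm, cited in the table's first row (and in the objectives), has no listing in these slides; a minimal illustrative sketch:

```java
public class GcdDemo {
    // Euclid's algorithm: repeatedly replace (m, n) with (n, m mod n).
    // Every two iterations at least halve the smaller argument, which
    // is why it fits the T(n) = T(n/2) + O(1) row, i.e., O(log n).
    public static long gcd(long m, long n) {
        while (n != 0) {
            long r = m % n;
            m = n;
            n = r;
        }
        return m;
    }
}
```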

Comparing Common Growth Functions
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < ... < O(2^n)

O(1)         Constant time
O(log n)     Logarithmic time
O(n)         Linear time
O(n log n)   Log-linear time
O(n^2)       Quadratic time
O(n^3)       Cubic time
O(2^n)       Exponential time
Comparing Common Growth Functions
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < ... < O(2^n)

[Figure: growth curves of O(1), O(log n), O(n), O(n log n), O(n^2), and
O(2^n) plotted against input size, fanning out in that order, with O(2^n)
rising fastest.]

