
Unit-1 Introduction

Analysis of Algorithms
Algorithm
• An algorithm is a finite set of steps to solve a problem. It presents the solution to a problem as a
well-defined sequence of steps or instructions.
Qualities of a good algorithm
• Inputs and outputs should be defined precisely.
• Each step in the algorithm should be clear and unambiguous.
• The algorithm should be the most effective among the many different ways to solve the problem.
• An algorithm shouldn't contain computer code. Instead, it should be written in such a way that it
can be implemented in different programming languages.
Features of an Algorithm:
• An algorithm takes some input, executes a finite number of steps, and produces an output.
• Every step involved in the algorithm must be executable.
• It must generate some result.
Efficiency of Algorithms
• Consider two algorithms run on two different systems:
• Algorithm A1: 50 n lg n instructions, run on a P.C. (10^6 ins/sec)
• Algorithm A2: 2 n^2 instructions, run on a supercomputer (10^8 ins/sec)

For n = 10^6

Time taken by the supercomputer (A2)
= 2 · (10^6)^2 / 10^8 = 20,000 sec

Time taken by the P.C. (A1)
= 50 · 10^6 · lg(10^6) / 10^6 ≈ 1,000 sec
Thus, by using a fast algorithm, the personal computer produces the result 20 times faster than the
supercomputer running a slow algorithm.

Thus, a good algorithm is like a sharp knife: it does exactly what it is supposed to do with a
minimum amount of effort.
Algorithm vs Program
• An algorithm is a design-level description of the solution and is independent of any particular
machine or programming language; a program is its concrete implementation in a specific
programming language.
Pseudocode
• Pseudocode is an English-language-like representation of the code required for an algorithm.

• It is partly English, partly structured code.

• The English part provides a relaxed syntax that is easy to read.

• The code part consists of an extended version of the basic algorithmic constructs: sequence,
selection, and iteration.
Sequence, Selection, Loop
• A sequence is a series of statements that do not alter the
execution path within an algorithm.
• Statements such as assign and add are sequence statements.
• A call to another algorithm is also considered a sequence
statement.
• Selection statements evaluate one or more alternatives; the path followed depends on the result
of the evaluation.
• The typical selection statement is the two-way selection:
if (condition) action 1 else action 2
• The parts of a loop are identified by indentation.
• A loop iterates a block of code.
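For illustration, a minimal Python sketch of the three constructs (the concrete values are assumed):

total = 0                     # sequence: simple assignment statements
x = 7                         # sequence
if x % 2 == 0:                # selection: two-way if/else
    total = total + x
else:
    total = total - x
for k in range(3):            # loop: the body repeats for k = 0, 1, 2
    total = total + k
print(total)                  # prints -4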
Linear loop
1 i = 1
2 loop (i <= 1000)
   1 application code
   2 i = i + 1

The body of the loop is repeated 1000 times.

1 i = 1
2 loop (i <= n)
   1 application code
   2 i = i + 2

Here the increment is 2, so the body is repeated about n/2 times; the time is still proportional to n.
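A minimal Python sketch of the two linear loops above (an assumed translation of the pseudocode,
with a counter added to make the iteration count visible):

def linear_loop_step1(n):
    count = 0
    i = 1
    while i <= n:
        count += 1            # application code would go here
        i = i + 1             # increment by 1: the body runs n times
    return count

def linear_loop_step2(n):
    count = 0
    i = 1
    while i <= n:
        count += 1            # application code would go here
        i = i + 2             # increment by 2: the body runs about n/2 times
    return count

print(linear_loop_step1(1000), linear_loop_step2(1000))   # 1000 500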
Logarithmic Loops

Multiply loop:
1 i = 1
2 loop (i < n)
   1 application code
   2 i = i * 2

Divide loop:
1 i = n
2 loop (i >= 1)
   1 application code
   2 i = i / 2

In both cases f(n) = [log n], because the controlling variable is doubled (or halved) on each pass
through the loop.
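A minimal Python sketch of the multiply and divide loops (assumed translation, with a counter
added to show the logarithmic number of passes):

def multiply_loop(n):
    count = 0
    i = 1
    while i < n:
        count += 1            # application code would go here
        i = i * 2             # i doubles each pass
    return count

def divide_loop(n):
    count = 0
    i = n
    while i >= 1:
        count += 1            # application code would go here
        i = i // 2            # i halves each pass
    return count

print(multiply_loop(1024), divide_loop(1024))   # 10 and 11, both about log2(1024) = 10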


Nested loop: linear logarithmic

1 i = 1
2 loop (i <= n)
   1 j = 1
   2 loop (j <= n)
      1 application code
      2 j = j * 2
   3 i = i + 1

f(n) = [n log n]: the outer loop runs n times, and the inner loop runs about log n times for each
pass of the outer loop.
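A minimal Python sketch of the linear logarithmic nested loop (assumed translation of the
pseudocode above):

def linear_log_loop(n):
    count = 0
    i = 1
    while i <= n:             # outer loop runs n times
        j = 1
        while j <= n:         # inner loop runs about log2(n) times
            count += 1        # application code would go here
            j = j * 2
        i = i + 1
    return count

print(linear_log_loop(16))    # 16 * 5 = 80, roughly n * log2(n)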
Dependent Quadratic

1 i = 1
2 loop (i <= n)
   1 j = 1
   2 loop (j <= i)
      1 application code
      2 j = j + 1
   3 i = i + 1

For n = 10, the number of iterations of the inner loop body is
1 + 2 + 3 + 4 + ... + 10 = 55, i.e. n(n + 1)/2.
On average the inner loop runs (n + 1)/2 times, so the total number of iterations is n(n + 1)/2.
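A minimal Python sketch of the dependent quadratic loop (assumed translation); for n = 10 it
reproduces the 55 iterations counted above:

def dependent_quadratic(n):
    count = 0
    i = 1
    while i <= n:
        j = 1
        while j <= i:         # inner limit depends on the outer counter
            count += 1        # application code would go here
            j = j + 1
        i = i + 1
    return count              # 1 + 2 + ... + n = n(n + 1)/2

print(dependent_quadratic(10))   # 55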
Quadratic

1 i = 1
2 loop (i <= n)
   1 j = 1
   2 loop (j <= n)
      1 application code
      2 j = j + 1
   3 i = i + 1

f(n) = n^2
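A minimal Python sketch of the quadratic loop (assumed translation of the pseudocode above):

def quadratic(n):
    count = 0
    i = 1
    while i <= n:
        j = 1
        while j <= n:         # inner loop always runs n times
            count += 1        # application code would go here
            j = j + 1
        i = i + 1
    return count              # n * n = n^2 iterations

print(quadratic(10))          # 100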
Algorithm Analysis
Analysis vs. Design

• Analysis: predict the cost of an algorithm in terms of resources and performance.

• Design: design algorithms that minimize the cost.
Time Complexity and Space Complexity
• https://www.geeksforgeeks.org/time-complexity-and-space-complexity/
Time Complexity
Real Time:
To analyze the real time complexity of a program we need to determine two numbers for each
statement in it:

• the amount of time a single execution of the statement takes;

• the number of times the statement is executed.

The product of these two numbers is the total time taken by the statement. The first number
depends on the machine and compiler used; hence real time complexity is machine dependent.
Frequency count
 To make analysis machine independent it is assumed that every
instruction takes the same constant amount of time for execution.
 Hence the determination of time complexity of a program is the
matter of summing the frequency counts of all the statements.
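For example (a small assumed program), the frequency counts of a simple summation loop can be
tallied statement by statement:

n = 100           # input size (assumed value for illustration)
s = 0             # executed 1 time
i = 1             # executed 1 time
while i <= n:     # tested n + 1 times (the last test fails)
    s = s + i     # executed n times
    i = i + 1     # executed n times
# total frequency count = 3n + 3, i.e. proportional to n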
Asymptotic Notation
In mathematics, asymptotic analysis, also known as asymptotics, is a method of describing the
limiting behavior of a function. In computing, asymptotic analysis of an algorithm refers to defining
the mathematical bound of its run-time performance as a function of the input size. For example,
the running time of one operation may be computed as f(n) = n and that of another operation as
g(n) = n^2. This means the running time of the first operation will increase linearly with the
increase in n, while the running time of the second operation will increase quadratically as n
increases. For small values of n, however, the running times of the two operations will be nearly
the same.

Asymptotic notations are represented as:

Omega Notation (Ω): Lower bound
Theta Notation (Θ): Tight bound
Big-Oh Notation (O): Upper bound
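Formally, these bounds have the following standard definitions (stated here for reference, with
positive constants c, c1, c2 and n0):

\begin{aligned}
f(n) = O(g(n)) &\iff \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ 0 \le f(n) \le c\, g(n) \ \text{for all } n \ge n_0,\\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ 0 \le c\, g(n) \le f(n) \ \text{for all } n \ge n_0,\\
f(n) = \Theta(g(n)) &\iff \exists\, c_1, c_2 > 0,\ n_0 > 0 \ \text{such that}\ c_1\, g(n) \le f(n) \le c_2\, g(n) \ \text{for all } n \ge n_0.
\end{aligned}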

Usually, the analysis of an algorithm is done based on three cases:
Best Case
Average Case
Worst Case

Big-Oh

Omega

Omega notation on its own doesn't really help to analyze an algorithm, because it is of little value
to evaluate an algorithm only for the best-case inputs.
Theta

Relations Between Θ, O, Ω
Asymptotic Notation (cont.)

• Note: even though it is correct to say "7n - 3 is O(n^3)", a better statement is "7n - 3 is O(n)";
that is, one should make the approximation as tight as possible.

• Simple rule: drop lower-order terms and constant factors.

7n - 3 is O(n)
8n^2 log n + 5n^2 + n is O(n^2 log n)
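As a quick check of the first claim (a worked step using assumed witnesses c = 7 and n0 = 1):

0 \le 7n - 3 \le 7n \quad \text{for all } n \ge 1, \qquad \text{hence } 7n - 3 = O(n) \text{ with } c = 7,\ n_0 = 1.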
Asymptotic Notation (Terminology)

• Special classes of algorithms:

constant: O(1)
logarithmic: O(log n)
linear: O(n)
quadratic: O(n^2)
polynomial: O(n^k), k ≥ 1
exponential: O(a^n), a > 1
Categories of algorithm efficiency

Efficiency            Big O
Constant              O(1)
Logarithmic           O(log n)
Linear                O(n)
Linear logarithmic    O(n log n)
Quadratic             O(n^2)
Polynomial            O(n^k)
Exponential           O(c^n)
Factorial             O(n!)
Asymptotic Analysis of the Running Time
• Use the Big-Oh notation to express the number of primitive operations executed as a function of
the input size.
• Comparing asymptotic running times:
- an algorithm that runs in O(n) time is better than one that runs in O(n^2) time
- similarly, O(log n) is better than O(n)
- hierarchy of functions: log n << n << n^2 << n^3 << 2^n
Amortized Analysis vs Asymptotic Analysis
• 1. Asymptotic Analysis
• Asymptotic analysis is used to analyze the growth rate of an algorithm's
running time or space requirements as the input size n approaches
infinity. It helps classify algorithms into Big-O, Big-Ω, or Big-Θ notations.
• Focus: Worst-case or average-case performance for a given input size.
• Goal: To estimate performance in terms of n for large inputs.
• Example:
• Suppose you want to find the maximum element in an array of size n (a minimal sketch is shown
below).

• Time Complexity: O(n) (linear time), since we iterate through the array once.
• Asymptotic Behavior: As n grows, the runtime grows linearly.
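A minimal Python sketch of this example (the function name find_max is an assumed choice):

def find_max(arr):
    maximum = arr[0]              # assumes a non-empty array
    for value in arr:             # one pass over all n elements
        if value > maximum:
            maximum = value
    return maximum

print(find_max([3, 7, 2, 9, 4]))  # 9; the loop body runs n times, so O(n)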

Asymptotic analysis focuses on how the algorithm scales with input size, irrespective of specific
operations or varying costs over time.

• 2. Amortized Analysis
• Amortized analysis considers the average cost of operations over a sequence of operations, rather than
analyzing each operation individually. This is particularly useful when some operations are expensive, but the
expensive operations are infrequent and balanced out by many cheaper ones.
• Focus: Average cost of operations over the long run.
• Goal: To provide a realistic view of performance for algorithms with non-uniform costs.
• Example:
• Suppose you're working with a dynamic array (like Python's list), which doubles in size when full. Here's how
it works:
• Append an element to the array.
• If the array is full, allocate a new array (double the size), copy all elements, and then append the new
element.
• Operation Cost:
– Regular append: O(1).
– Resize (when full): O(n) because all elements are copied.
• Amortized Analysis for n Operations:
• Suppose the array starts empty with capacity 1. Every time it fills up, the capacity doubles and all
current elements are copied, but this doubling happens rarely (a short sketch follows after this list).
• Over n operations:
– Most operations are cheap O(1).
– Only a few resizing operations are costly O(n).
• Total Cost:
• The cost of all the resizing is: 1+2+4+⋯+n, which sums up to O(n).
• Dividing this O(n) cost over n operations, the amortized cost per operation is O(1).
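A minimal Python sketch of this behaviour (the class name DynamicArray and the copy counter are
assumptions added for illustration):

class DynamicArray:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.copies = 0                     # total elements copied during resizes

    def append(self, value):
        if self.size == self.capacity:      # full: allocate a new array of double the size
            new_data = [None] * (2 * self.capacity)
            for k in range(self.size):      # O(n) copy, but it happens rarely
                new_data[k] = self.data[k]
                self.copies += 1
            self.data = new_data
            self.capacity *= 2
        self.data[self.size] = value        # regular O(1) append
        self.size += 1

arr = DynamicArray()
n = 1024
for x in range(n):
    arr.append(x)
print(arr.copies, arr.copies / n)           # 1023 total copies, i.e. less than 1 copy per append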

• Amortized analysis gives the average cost per operation, accounting for occasional expensive
operations, while asymptotic analysis doesn't consider this balance—it would simply state O(n)
for the resizing step.
Key Differences

Aspect         Asymptotic Analysis                        Amortized Analysis
Focus          Growth rate as input size grows            Average cost of a sequence of operations
Granularity    Single operation or single input size      Sequence of operations
Examples       Sorting algorithms, searching algorithms   Dynamic arrays, hash table insertions, splay trees
Result         Big-O / Big-Ω classification               Average-case time complexity for repeated operations
Real Life Example
• Asymptotic Analysis: Imagine you're timing how long it takes to drive to a destination. If there's
a traffic jam, you record the maximum time it takes for the worst-case scenario.

• Amortized Analysis: Imagine you take multiple trips to the same destination. Sometimes you hit
traffic, but most trips are smooth. You average the time over all the trips to get a realistic measure
of your commute.
