UNIT – I: Introduction
❖ Introduction: The Role of Algorithms in Computing,
Algorithms as a technology, Analyzing algorithms,
Designing algorithms, Growth of Functions, Asymptotic
notation, Standard notations and common functions.
❖ Fundamental Algorithms: Exchanging the values of two
variables, Counting, Summation of a set of numbers,
Factorial Computation, Generation of the Fibonacci
sequence, Reversing the digits of an integer, Character to
number conversion.
What is an Algorithm?
❖ An algorithm is a process or a set of rules to be followed to perform
calculations or other problem-solving operations, especially by a
computer.
Computer A                                    Computer B
Running Insertion Sort to sort an array       Running Merge Sort to sort an array
Running time of Insertion Sort = n^2          Running time of Merge Sort = n log n
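To make the comparison concrete, here is a minimal sketch of Insertion Sort in C; the function name and the sample array in main are illustrative assumptions, not part of the syllabus text. The nested scan over the already-sorted prefix is what gives the roughly n^2 running time quoted above.

#include <stdio.h>

/* Insertion Sort: inserts a[i] into the already-sorted prefix a[0..i-1]. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        /* Shift larger elements one position to the right. */
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}

int main(void) {
    int a[] = {5, 2, 4, 6, 1, 3};
    insertion_sort(a, 6);
    for (int i = 0; i < 6; i++)
        printf("%d ", a[i]);    /* prints: 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}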
Designing Algorithms
● Brute Force Algorithm: This is the most basic and simplest type of
algorithm. A brute force algorithm takes the straightforward approach to a
problem.
● It simply iterates through every possibility available to solve that problem.
● This type of algorithm is often used to locate the ideal or best
solution, because it checks all the potential solutions.
● Example: consider a lock with a 4-digit PIN whose digits are chosen from 0-9.
Brute force tries all possible combinations one by one, such as
0000, 0001, 0002, 0003, and so on, until it finds the right PIN (see the sketch below).
● In the worst case, it will take 10,000 tries to find the right combination.
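A minimal sketch of this brute force PIN search in C; the secret_pin value and the check_pin helper are hypothetical, introduced only so the loop can run.

#include <stdio.h>

/* Hypothetical secret PIN used only for the demonstration. */
static const int secret_pin = 7294;

/* Returns 1 if the guess matches the secret PIN, 0 otherwise. */
int check_pin(int guess) {
    return guess == secret_pin;
}

int main(void) {
    /* Try every 4-digit combination: 0000 .. 9999 (10,000 tries at worst). */
    for (int guess = 0; guess <= 9999; guess++) {
        if (check_pin(guess)) {
            printf("PIN found: %04d after %d tries\n", guess, guess + 1);
            break;
        }
    }
    return 0;
}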
Designing Algorithms
● Recursive Algorithms: This type of algorithm is based on recursion.
● In recursion, a problem is solved by breaking it into sub-problems of the
same type and calling itself again and again until a base condition stops
the recursion.
● Example: some common problems that can be solved through recursion are
factorial computation, generation of the Fibonacci sequence, and the
Tower of Hanoi (a sketch follows below).
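As a sketch of the recursive pattern, here is a factorial function in C (factorial computation is one of the Fundamental Algorithms listed for this unit); the function name and the test value in main are assumptions for illustration.

#include <stdio.h>

/* Recursive factorial: the base condition n <= 1 stops the recursion. */
unsigned long long factorial(unsigned int n) {
    if (n <= 1)
        return 1;                    /* base condition */
    return n * factorial(n - 1);     /* the function calls itself on a smaller sub-problem */
}

int main(void) {
    printf("5! = %llu\n", factorial(5));   /* prints: 5! = 120 */
    return 0;
}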
Designing Algorithms
● Dynamic Programming: A dynamic programming algorithm works by
remembering the results of a previous run and using them to arrive at new
results.
● Such an algorithm solves a complex problem by breaking it into multiple
simpler subproblems, solving them one by one, and storing their results for
future reference and use (see the sketch below).
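A minimal sketch of this idea in C, using the Fibonacci sequence from this unit's Fundamental Algorithms list: results of subproblems are stored in a memo table so each value is computed only once. The array size limit and the function name are assumptions for illustration.

#include <stdio.h>

#define MAX_N 92                  /* illustrative limit; fib(93) would overflow signed 64-bit */

static long long memo[MAX_N + 1]; /* 0 means "not computed yet" (fib(0) and fib(1) handled below) */

/* Fibonacci with memoization: each subproblem's result is stored and reused. */
long long fib(int n) {
    if (n <= 1)
        return n;
    if (memo[n] != 0)
        return memo[n];            /* reuse the stored result of a previous run */
    memo[n] = fib(n - 1) + fib(n - 2);
    return memo[n];
}

int main(void) {
    printf("fib(10) = %lld\n", fib(10));   /* prints: fib(10) = 55 */
    printf("fib(50) = %lld\n", fib(50));   /* prints: fib(50) = 12586269025 */
    return 0;
}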
● Priori Analysis
● "Priori" means "before". Hence priori (a priori) analysis means checking the
algorithm before its implementation.
● In this analysis, the algorithm is checked while it is written in the form of
theoretical steps. This is usually done by the algorithm designer.
● The algorithm's complexity is determined in this phase.
● In this analysis we obtain a function which bounds the algorithm's computing
time. Suppose there is some statement and we wish to determine the
total time that statement will spend in execution, given some initial
state of the input data.
Priori Analysis and Posteriori Analysis
● This requires essentially two items of information: the time taken for each
kind of operation and the number of times each operation is performed.
● The computing time can then be written as
  t(n) = ta ADD(n) + ts SUB(n) + tm MUL(n) + . . .
● where n indicates the instance characteristic and ta, ts, tm, . . . denote
the time needed for an addition, subtraction, multiplication, and so on.
● ADD, SUB, MUL, . . . are functions whose values are the numbers of additions,
subtractions, multiplications, and so on, performed when the code for the
program is run on an instance of characteristic n.
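To make the notation concrete, here is a small, hypothetical C example using the summation algorithm from this unit's Fundamental Algorithms list; the comments show how one statement's contribution to t(n) would be counted. The mapping in the comments is an illustrative assumption, not text from the source.

#include <stdio.h>

/* Summation of a set of numbers.
 * Priori analysis of the statement s = s + a[i]: its frequency count is n,
 * and each execution performs one addition, so this statement alone
 * contributes n additions, i.e. ta * n to t(n). */
double sum(const double a[], int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s = s + a[i];
    return s;
}

int main(void) {
    double a[] = {1.0, 2.0, 3.0, 4.0};
    printf("sum = %.1f\n", sum(a, 4));   /* prints: sum = 10.0 */
    return 0;
}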
Time Complexity
● Obtaining such an exact formula is in itself an impossible task, since the
time needed for an addition, subtraction, multiplication, and so on, often
depends on the numbers being added, subtracted, multiplied, and so on.
● The value of t(n) for any given n can therefore be obtained only
experimentally (see the sketch below).
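A minimal sketch of such experimental (posteriori) measurement in C, using the standard clock() function from time.h; the loop body and the problem size are arbitrary assumptions chosen only to give something measurable.

#include <stdio.h>
#include <time.h>

int main(void) {
    const long n = 100000000L;            /* arbitrary problem size for the experiment */
    volatile unsigned long long s = 0;    /* volatile so the loop is not optimised away */

    clock_t start = clock();
    for (long i = 0; i < n; i++)
        s = s + (unsigned long long)i;    /* the statement whose total time we measure */
    clock_t end = clock();

    double seconds = (double)(end - start) / CLOCKS_PER_SEC;
    printf("measured t(n) for n = %ld: %.3f seconds\n", n, seconds);
    return 0;
}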
Code              Description
A = a * b;        This code takes 1 unit of time

Growth rate       Name
n                 linear
n log n           linearithmic
n^2               quadratic
n^3               cubic
2^n               exponential
n!                factorial
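As an illustrative sketch (the loop bodies are assumptions, not from the source), the following C fragments show where the common growth rates in the table above come from: a single statement costs a constant number of units, a single loop over n items costs on the order of n units, and two nested loops cost on the order of n^2 units.

#include <stdio.h>

int main(void) {
    int n = 8;
    long a = 0, b = 3, c = 4;

    /* Constant: executed once, takes 1 unit of time. */
    a = b * c;

    /* Linear: the loop body runs n times, so roughly n units. */
    long linear_count = 0;
    for (int i = 0; i < n; i++)
        linear_count++;

    /* Quadratic: the inner statement runs n * n times, so roughly n^2 units. */
    long quadratic_count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            quadratic_count++;

    printf("a = %ld, linear = %ld, quadratic = %ld\n", a, linear_count, quadratic_count);
    /* prints: a = 12, linear = 8, quadratic = 64 */
    return 0;
}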
The order of growth is
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
Orders of Growth
● To see how the various functions grow with n, it is advised to study the
following table.

n       log n    n log n    n^2      n^3       2^n
1       0        0          1        1         2
2       1        2          4        8         4
4       2        8          16       64        16
8       3        24         64       512       256
16      4        64         256      4,096     65,536
32      5        160        1,024    32,768    4,294,967,296
Orders of Growth
● It is evident from the above table that the function 2^n grows very rapidly
with n.
● In fact, if an algorithm needs 2^n steps for execution, then for n = 32
the number of steps needed is 2^32 = 4,294,967,296, approximately 4.3 x 10^9.
● Therefore we may conclude that the utility of algorithms with exponential
complexity is limited to small n.
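To reproduce the figure quoted above, here is a tiny illustrative C program (an assumption for these notes, not part of the syllabus) that prints 2^n for a few values of n; for n = 32 it prints 4294967296, roughly 4.3 x 10^9 steps.

#include <stdio.h>

int main(void) {
    /* 2^n grows very rapidly: print it for a few values of n. */
    int values[] = {8, 16, 32};
    for (int k = 0; k < 3; k++) {
        int n = values[k];
        unsigned long long steps = 1ULL << n;    /* 2^n */
        printf("n = %2d  ->  2^n = %llu steps\n", n, steps);
    }
    /* prints:
       n =  8  ->  2^n = 256 steps
       n = 16  ->  2^n = 65536 steps
       n = 32  ->  2^n = 4294967296 steps */
    return 0;
}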