DAA-U-1
1. Algorithms are necessary for solving complex problems efficiently and effectively,
such as finding the optimal solution to a system of linear equations or
finding the shortest path in a graph.
2. They help to automate processes and make them more reliable, faster,
and easier to perform.
3. Algorithms also enable computers to perform tasks that would be
difficult or impossible for humans to do manually.
What are the Characteristics of an Algorithm?
The main characteristics of an algorithm are: Well-Defined Inputs, Well-Defined
Outputs, Finiteness, Feasibility (Effectiveness), and Language Independence.
• Well-Defined Outputs: the algorithm must clearly define what output will be
produced, and it should produce at least 1 output.
• Feasibility/Effectiveness: an algorithm must be developed by using very
basic, simple, and feasible operations so that one can trace it out by
using just paper and pencil.
Properties of Algorithm:
• It should terminate after a finite time.
• It should produce at least one output.
• It should take zero or more inputs.
• It should be deterministic, i.e., it should give the same output for the same
input.
• Every step in the algorithm must be effective, i.e., every step should do
some work.
Types of Algorithms:
• Brute Force Algorithm
• Recursive Algorithm
• Backtracking Algorithm
• Searching Algorithm
• Divide and Conquer Algorithm
Advantages of Algorithms:
• An algorithm is easy to understand.
• An algorithm is a step-wise representation of a solution to a given
problem.
• In an algorithm, the problem is broken down into smaller pieces or
steps; hence, it is easier for the programmer to convert it into an
actual program.
Disadvantages of Algorithms:
Fundamentals of Algorithmic Problem Solving:
Designing and analyzing an algorithm involves a sequence of steps; two of those
steps (analyzing the algorithm and coding the algorithm) are summarized below.
Analyzing an Algorithm
• For an algorithm, the most important quality is efficiency. In fact, there are two kinds
of algorithm efficiency: time efficiency and space efficiency.
Coding an Algorithm
• Standard tricks, like computing a loop's invariant (an expression that does not
change its value inside the loop) outside the loop, can speed a program up only by a
constant factor, whereas a better algorithm can improve the running time by
orders of magnitude.
Complexity:
• The performance of a computer program is the amount of
memory and time needed to run the program.
• Time efficiency, also called time complexity, indicates how quickly an algorithm
runs.
• The space complexity of an algorithm is the amount of space (memory) it needs
to run to completion. It consists of:
• Instruction space
• Data space
• Environment stack space
• S = S(fixed) + S(variable), i.e., a fixed part (code, constants, simple variables)
plus a variable part that depends on the input.
Example 01:
Algorithm Sum(A, n)
{
    s = 0; ------------------------------- 1
    for (i = 0; i < n; i++) -------------- n + 1
        s = s + A[i]; -------------------- n
    return s; ---------------------------- 1
}
f(n) = 2n + 3
Time Complexity: O(n)
Space Complexity:
A ----→ n
n ----→ 1
s ----→ 1
i ----→ 1
S(n) = n + 3
Space Complexity: O(n)
Example 02:
f(n) = 2n² + 2n + 1
Time Complexity: O(n²)
S(n) = 3n² + 3
Space Complexity: O(n²)
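The listing for Example 02 is not reproduced in these notes; the step counts
2n² + 2n + 1 and 3n² + 3 match the usual example of adding two n × n matrices, so
the following C sketch is written under that assumption (the function name and
signature are not from the notes):

/* Assumed example: add two n x n matrices A and B into C.
   Step counts: the outer loop header runs n + 1 times, the inner loop
   headers run n(n + 1) times, and the body runs n * n times,
   giving f(n) = 2n^2 + 2n + 1.
   Space: A, B, C take n^2 words each and n, i, j take 1 word each,
   so S(n) = 3n^2 + 3. */
void add(int n, int A[n][n], int B[n][n], int C[n][n])
{
    for (int i = 0; i < n; i++)              /* n + 1 */
        for (int j = 0; j < n; j++)          /* n(n + 1) */
            C[i][j] = A[i][j] + B[i][j];     /* n * n */
}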
Analysis Framework
Almost all algorithms run longer on larger inputs, so an algorithm's efficiency is
investigated as a function of some parameter n indicating the size of its input.
For example, it will be the size of the list for problems of sorting, searching, finding
the list's smallest element, and most other problems dealing with lists.
There are situations, of course, where the choice of the parameter indicating the input
size does matter.
➢ To measure running time, one possible approach is to count the number of times
each of the algorithm's operations is executed. This approach is both difficult and
unnecessary; the standard approach is to identify the basic operation (the operation
contributing the most to the total running time) and count how many times it is executed.
Time Complexity:
The time complexity of an algorithm quantifies the amount of time taken by an
algorithm to run as a function of the length of the input. Note that the time to run
is a function of the length of the input and not the actual execution time on the
machine on which the algorithm is running.
Definition
A valid algorithm takes a finite amount of time for execution. The time
required by the algorithm to solve a given problem is called the time complexity of
the algorithm. Time complexity is a very useful measure in algorithm analysis.
Space complexity:
We often speak of extra memory needed, not counting the memory needed to
store the input itself. Again, we use natural (but fixed-length) units to measure
this.
We can use bytes, but it's easier to use, say, the number of integers used, the
number of fixed-sized structures, etc.
In the end, the function we come up with will be independent of the actual
number of bytes needed to represent the unit.
Space complexity is sometimes ignored because the space used is minimal and/or
obvious; however, sometimes it becomes as important an issue as time complexity.
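The paragraph that follows refers to a linear search routine whose listing is not
reproduced in these notes; a minimal C sketch of what it presumably looks like is
given first (the name search and its exact signature are assumptions):

int search(int a[], int n, int item)
{
    for (int i = 0; i < n; i++) {
        if (a[i] == item)
            return i;    /* found: return the index immediately */
    }
    return -1;           /* not found */
}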
Variable a is an array, n is the size of the array and item is the item we are looking for
in the array. When the item we are looking for is in the very first position of the array,
it will return the index immediately.
The for loop runs only once. So the complexity, in this case, will be O(1). This is
called the best case.
Consider another example of insertion sort. Insertion sort sorts the items in the input
array in an ascending (or descending) order. It maintains the sorted and un-sorted
parts in an array. It takes items from the un-sorted part and inserts them into the sorted
part at the appropriate position. Consider one snapshot of the insertion operation:
the items [1, 4, 7, 11, 53] are already sorted and now we want to place 33 in
its appropriate place.
The item to be inserted is compared with the items from right to left, one by one,
until we find an item that is smaller than the item we are trying to insert.
We compare 33 with 53; since 53 is bigger, we move one position to the left and
compare 33 with 11. Since 11 is smaller than 33, we place 33 just after 11 and move
53 one step to the right.
Here we did 2 comparisons. If the item were 55 instead of 33, we would have
performed only one comparison.
That means, if the array is already sorted then only one comparison is necessary to
place each item to its appropriate place and one scan of the array would sort it. The
code for the insertion operation is given below.
void sort(int a[], int n)
{
    int i, j, key;
    for (i = 1; i < n; i++) {        /* a[0] alone is trivially sorted */
        key = a[i];                  /* item to insert into the sorted part a[0..i-1] */
        j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];         /* shift larger items one position to the right */
            j = j - 1;
        }
        a[j + 1] = key;              /* place the item at its correct position */
    }
}
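A quick driver (hypothetical, not part of the notes, assuming it is compiled together
with the sort function above) that calls sort on an already-sorted array illustrates
the best case discussed next:

#include <stdio.h>

int main(void)
{
    int a[] = {1, 4, 7, 11, 33, 53};   /* already sorted: best case */
    int n = sizeof(a) / sizeof(a[0]);
    sort(a, n);                        /* the while loop test fails immediately for every item */
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}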
When the items are already sorted, the while loop test executes only once for each item.
There are n items in total, so the running time would be O(n).
So the best case running time of insertion sort is O(n).
The best case gives us a lower bound on the running time for any input.
If the best case of the algorithm is O(n), then we know that for any input the program
needs at least linear time to run. In reality, we rarely need the best case for our
algorithm, and we never design an algorithm based on the best case scenario.
In real life, most of the time we do the worst case analysis of an algorithm. Worst
case running time is the longest running time for any input of size n.
In the linear search, the worst case happens when the item we are searching for is in the
last position of the array or the item is not in the array at all.
In both cases, we need to go through all n items in the array. The worst case
runtime is, therefore, O(n). Worst case performance is more important than best
case performance for linear search because of the following reasons (a short usage
example follows the list).
1. The item we are searching for is rarely in the first position. If the array has 1000
items, numbered 1 to 1000, and we search for a randomly chosen item, there is only a
0.1 percent (1 in 1000) chance that the item will be in the first position.
2. Most of the time the item is not in the array (or database in general), in which
case the search takes the worst case running time.
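Continuing the hypothetical search sketch from earlier (and assuming it is compiled in
the same file), the worst case can be seen by searching for an item that is not present:

#include <stdio.h>

int main(void)
{
    int a[] = {2, 5, 8, 13, 21};
    int n = sizeof(a) / sizeof(a[0]);
    /* 99 is not in the array, so the loop inspects all n items: worst case O(n) */
    printf("%d\n", search(a, n, 99));   /* prints -1 */
    return 0;
}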
Similarly, in insertion sort, the worst case scenario occurs when the items are reverse
sorted. The number of comparisons in the worst case will be on the order of n²/2 and
hence the running time is O(n²).
In the case of insertion sort, when we try to insert a new item to its appropriate
position, we compare the new item with half of the sorted items on average.
The complexity is still on the order of n² (about n²/4 comparisons), the same order as
the worst-case running time.
It is usually harder to analyze the average behavior of an algorithm than to analyze its
behavior in the worst case.
This is because it may not be apparent what constitutes an “average” input for a
particular problem.
Order of growth:
It is described by the highest degree term of the formula for running time. (Drop
lower-order terms. Ignore the constant coefficient in the leading term.)
Example: We found out that for insertion sort the worst-case running time is of the
form an² + bn + c; dropping the lower-order terms and the coefficient of the leading
term, its order of growth is n².
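As a quick illustration (with made-up coefficients): take a = 2, b = 100, c = 50. At
n = 1000 the leading term an² = 2,000,000 while bn + c = 100,050, so the leading term
already accounts for more than 95% of the total, and its share only grows with n.
This is why only the order of growth, n², is kept.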
We usually consider one algorithm to be more efficient than another if its worst-
case running time has a smaller order of growth.