Algorithms 2 Marks

Created: May 24, 2025, 6:05 PM

1. Define time complexity of an algorithm.


Time complexity refers to the amount of time an algorithm takes to complete as
a function of the size of its input, usually denoted as n.

2. List the types of asymptotic notations used in analyzing the complexity of algorithms.

The main types are:

Big O (O)

Omega (Ω)

Theta (Θ)

3. Define recursion relation.


A recurrence relation expresses the running time of a recursive algorithm in
terms of the running time on smaller inputs.
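
For a concrete illustration (a standard textbook example, not part of the original notes): merge sort's running time satisfies the recurrence T(n) = 2T(n/2) + cn with T(1) = c, which solves to O(n log n).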

4. Discuss the time and space complexity of insertion sort.


Time Complexity: Best: O(n), Average/Worst: O(n²)

Space Complexity: O(1)
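
For reference, a minimal Python sketch of insertion sort (an illustration, not part of the original 2-mark answer); the inner while loop shows why the best case is O(n) on already-sorted input and the worst case is O(n²):

```python
def insertion_sort(a):
    """In-place insertion sort: O(n) best case (sorted input), O(n^2) worst case, O(1) extra space."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one position right to make room for key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```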

5. State how the running time of an algorithm is measured.


Running time is measured by counting the number of primitive operations or
steps executed as a function of input size.

6. Outline the significance of performing worst-case analysis of an algorithm.

Worst-case analysis ensures an upper bound on running time, giving
performance guarantees regardless of input.

7. Define the asymptotic notation “Big Oh” (O).
Big O represents the upper bound of an algorithm's running time, showing the
worst-case growth rate.

8. Define Knuth-Morris-Pratt algorithm.


KMP is a string-matching algorithm that avoids redundant comparisons by
preprocessing the pattern using a partial match table.
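
A small illustrative sketch (a standard KMP formulation, not taken from the notes): the partial match table stores, for each prefix of the pattern, the length of its longest proper prefix that is also a suffix, so the search never re-examines already-matched text characters:

```python
def build_partial_match_table(pattern):
    """table[i] = length of the longest proper prefix of pattern[:i+1] that is also its suffix."""
    table = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = table[k - 1]          # fall back to a shorter border
        if pattern[i] == pattern[k]:
            k += 1
        table[i] = k
    return table

def kmp_search(text, pattern):
    """Return all start indices where pattern occurs in text, in O(n + m) time."""
    table, matches, k = build_partial_match_table(pattern), [], 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = table[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            matches.append(i - k + 1)
            k = table[k - 1]
    return matches

print(kmp_search("ababcabcabababd", "ababd"))  # [10]
```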

9. Define an algorithm.
An algorithm is a finite sequence of well-defined instructions to solve a specific
problem or perform a computation.

10. Define best, worst, average case time complexity.


Best: Minimum time required

Worst: Maximum time possible

Average: Expected time over all inputs

11. Outline a directed graph with an example.


A directed graph is a graph in which every edge has a direction, pointing from one vertex to another.

Example: A → B → C

12. What are the constraints to represent a transportation
network?
Constraints include capacity, cost, direction of routes, and demand at nodes.

13. What is bipartite graph?


A graph whose vertices can be divided into two disjoint sets such that no two
vertices in the same set are adjacent.

14. List the applications of graphs.


Used in networking, pathfinding, scheduling, social networks, and circuit
design.

15. What is a strongly connected graph? Give an example.


A directed graph is strongly connected if there’s a path from every vertex to
every other vertex.
Example: A↔B↔C↔A

16. Prove that the number of odd-degree vertices in a connected graph is even.

By the Handshaking Lemma, the sum of all vertex degrees equals twice the number of edges, so it is even. The even-degree vertices contribute an even amount to this sum, so the degrees of the odd-degree vertices must also add up to an even number; a sum of odd numbers is even only when it has an even number of terms, hence the number of odd-degree vertices is even.

17. Define transitive closure of a directed graph.


The transitive closure of a graph contains an edge from vertex u to v if there is
a path from u to v in the original graph.

18. How is the efficiency of Dijkstra’s algorithm calculated?


Using a priority queue, its time complexity is O((V + E) log V), where V =
vertices, E = edges.
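
An illustrative heap-based sketch in Python (assumptions: non-negative edge weights and an adjacency-list graph of (neighbour, weight) pairs):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source in a graph given as {u: [(v, weight), ...]}.
    With a binary heap this runs in O((V + E) log V)."""
    dist = {u: float("inf") for u in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:              # stale heap entry, skip it
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:      # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

g = {"A": [("B", 4), ("C", 1)], "B": [("D", 1)], "C": [("B", 2), ("D", 5)], "D": []}
print(dijkstra(g, "A"))  # {'A': 0, 'B': 3, 'C': 1, 'D': 4}
```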

19. What do you mean by perfect matching in bipartite graph?


A perfect matching pairs all vertices of one set to the other set with no
unmatched vertices.

20. Define Floyd-Warshall algorithm.

An all-pairs shortest path algorithm using dynamic programming with time
complexity O(n³).
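
A minimal sketch of the three nested loops (illustrative only; assumes a square distance matrix with float('inf') for missing edges):

```python
def floyd_warshall(dist):
    """All-pairs shortest paths on an n x n distance matrix. O(n^3) time."""
    n = len(dist)
    d = [row[:] for row in dist]     # copy so the input is not modified
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # Can the path i -> j be improved by routing through vertex k?
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

INF = float("inf")
print(floyd_warshall([[0, 3, INF], [INF, 0, 1], [2, INF, 0]]))
# [[0, 3, 4], [3, 0, 1], [2, 5, 0]]
```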

21. Outline divide-and-conquer algorithm design paradigm.


It involves dividing the problem into subproblems, solving them recursively, and
combining their solutions.

22. Define a multistage graph.


A directed graph divided into stages, with edges only between nodes in
successive stages.

23. What is meant by principle of optimality?


It states that an optimal solution to a problem contains optimal solutions to its
subproblems.

24. Write down the steps to build Huffman tree.


1. Create leaf nodes for each character.

2. Build a min-heap.

3. Extract the two smallest nodes, merge them, and insert the merged node back into the heap.

4. Repeat until one node remains—the root (see the sketch below).
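
A minimal sketch of these steps using Python's heapq (illustrative only; the tie-breaker integer just keeps heap comparisons well defined):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman tree with a min-heap and return {character: codeword}."""
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker, tree); a tree is a character or a (left, right) pair.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two smallest-frequency nodes
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def assign(node, prefix):
        if isinstance(node, str):            # leaf: a single character
            codes[node] = prefix or "0"
        else:
            assign(node[0], prefix + "0")
            assign(node[1], prefix + "1")
    assign(heap[0][2], "")
    return codes

print(huffman_codes("aaaabbc"))  # {'c': '00', 'b': '01', 'a': '1'}
```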

25. What is the general plan for divide-and-conquer algorithms?

1. Divide

2. Conquer

3. Combine

26. State the elements of greedy approach.


Greedy-choice property

Optimal substructure

27. What are the differences between dynamic programming and divide-and-conquer approaches?

DP stores and reuses subproblem solutions; divide-and-conquer solves
subproblems independently.

28. What is the time and space complexity of Merge sort?


Time: O(n log n)

Space: O(n)
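
A short recursive sketch (illustrative, not from the notes); the merge step is where the O(n) auxiliary space comes from:

```python
def merge_sort(a):
    """Divide-and-conquer sort: O(n log n) time, O(n) auxiliary space."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves into a new list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```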

29. Define the principle of optimality.


An optimal solution contains optimal solutions to its subproblems.

30. Write the difference between the Greedy method and Dynamic programming.

Greedy makes locally optimal choices; DP solves and stores all subproblems for
global optimum.

31. What is backtracking?


A problem-solving method that incrementally builds candidates and abandons
non-promising ones.

32. What are the factors that influence the efficiency of the
backtracking algorithm?
Problem constraints, pruning conditions, and branching factor.

33. Write short notes on graph colouring.


Graph colouring assigns colors to vertices such that no adjacent vertices share
the same color, used in scheduling.
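
A simple greedy colouring sketch (an assumed illustration; greedy colouring is not guaranteed to use the minimum number of colours):

```python
def greedy_coloring(graph):
    """Assign each vertex the smallest colour not used by an already-coloured neighbour.
    graph: {vertex: set of neighbours}."""
    colour = {}
    for v in graph:
        taken = {colour[u] for u in graph[v] if u in colour}
        c = 0
        while c in taken:
            c += 1
        colour[v] = c
    return colour

g = {"A": {"B", "C", "D"}, "B": {"A", "C"}, "C": {"A", "B"}, "D": {"A"}}
print(greedy_coloring(g))  # {'A': 0, 'B': 1, 'C': 2, 'D': 1}
```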

34. What is travelling salesman problem? Give an example.


It seeks the shortest possible route that visits each city exactly once and returns to the starting city.

Example: A → B → C → D → A

35. With an example, define Hamiltonian circuit.


A Hamiltonian circuit visits each vertex of a graph exactly once and returns to the starting vertex.
Example: A → B → C → D → A

36. Why is branch and bound approach found to be
appropriate for solving travelling salesman problem?
It prunes paths that can’t yield better solutions, reducing computation.

37. When can a node be terminated in the subset-sum problem?

If the sum exceeds the target or if no remaining items can reach the sum.
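
A minimal backtracking sketch for subset sum (illustrative; assumes non-negative numbers): a node is terminated under exactly the two conditions above, i.e. the running sum overshoots the target, or even adding everything that remains cannot reach it:

```python
def subset_sum(nums, target):
    """Return one subset of nums summing to target, or None."""
    nums = sorted(nums, reverse=True)
    # suffix_sum[i] = sum of nums[i:], used to prune nodes that can no longer reach target.
    suffix_sum = [0] * (len(nums) + 1)
    for i in range(len(nums) - 1, -1, -1):
        suffix_sum[i] = suffix_sum[i + 1] + nums[i]

    def backtrack(i, current, chosen):
        if current == target:
            return chosen
        # Non-promising node: sum already too large, or remaining items cannot reach the target.
        if i == len(nums) or current > target or current + suffix_sum[i] < target:
            return None
        # Branch: include nums[i], then exclude it.
        return (backtrack(i + 1, current + nums[i], chosen + [nums[i]])
                or backtrack(i + 1, current, chosen))

    return backtrack(0, 0, [])

print(subset_sum([8, 6, 7, 5, 3], 16))  # [8, 5, 3]
```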

38. What is a promising node and non-promising node in the state-space tree?

A promising node can lead to a solution; a non-promising one cannot and is
pruned.

39. Compare backtracking and branch and bound.


Backtracking explores all feasible paths; branch and bound uses cost-based
pruning for optimization problems.

40. State the reason for terminating search path at the current
node in branch and bound algorithm.
To avoid exploring suboptimal paths that cannot yield better results than the
current best.

41. Outline the difference between a tractable problem and an intractable problem.

Tractable problems can be solved in polynomial time; intractable problems
cannot.

42. State the quick sort algorithm.


Quick sort is a divide-and-conquer sorting algorithm with average time
complexity O(n log n).

43. Differentiate tractable and intractable problems.


Tractable: Solvable in polynomial time.

Intractable: Requires super-polynomial or exponential time.

44. Write an algorithm to find the kth smallest number.

Use Quickselect:

Choose pivot

Partition array

Recurse into the appropriate subarray (see the sketch below)
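
A minimal Quickselect sketch (illustrative; uses a random pivot and 1-based k):

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element (1-based) of list a.
    Expected O(n) time, O(n^2) in the worst case."""
    pivot = random.choice(a)
    smaller = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    larger  = [x for x in a if x > pivot]
    if k <= len(smaller):                  # answer lies in the smaller partition
        return quickselect(smaller, k)
    if k <= len(smaller) + len(equal):     # the pivot itself is the answer
        return pivot
    return quickselect(larger, k - len(smaller) - len(equal))

print(quickselect([7, 2, 9, 4, 1, 5], 3))  # 4
```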

45. How are NP-Hard problems different from NP-Complete problems?


NP-Hard may not be in NP (not verifiable in polynomial time); NP-Complete is
both in NP and NP-Hard.

46. When is a problem said to be NP-hard? Give an example.


If every NP problem reduces to it in polynomial time.
Example: TSP (decision version)

47. How could an NP-hard problem be solved in deterministic polynomial time?

Only if P = NP, which is an unsolved question in computer science.

48. Define NP completeness and NP hard.


NP-Complete: Problems that are both in NP and NP-Hard.

NP-Hard: At least as hard as every problem in NP, but not necessarily in NP itself.

49. Define P and NP problems.


P: Solvable in polynomial time.

NP: Verifiable in polynomial time.

50. What do you mean by primality testing?


It determines whether a given number is prime, often used in cryptography.
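
A minimal sketch of deterministic trial division (illustrative only; practical cryptographic primality testing would instead use a probabilistic test such as Miller-Rabin):

```python
def is_prime(n):
    """Trial division up to sqrt(n); fine for small n."""
    if n < 2:
        return False
    if n < 4:
        return True            # 2 and 3 are prime
    if n % 2 == 0:
        return False
    d = 3
    while d * d <= n:          # only odd divisors up to sqrt(n) need checking
        if n % d == 0:
            return False
        d += 2
    return True

print([p for p in range(30) if is_prime(p)])  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```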
