leetcode_slides

This document is a comprehensive guide to LeetCode problems, organized by data structures and algorithms, providing approaches, solutions, and complexity analysis. It covers various problem categories including arrays, linked lists, trees, graphs, and dynamic programming, along with techniques like two pointers and sliding window. Additionally, it includes a general problem-solving framework and insights into performance analysis using Big O notation.

Uploaded by shubham jha
© All Rights Reserved

LeetCode Problems and Solutions

A comprehensive guide to common coding interview problems

This presentation covers a wide range of LeetCode problems organized by data structures and algorithms, providing approaches, solutions, and complexity analysis for each problem.

Arrays: techniques for handling sequences of elements, including sliding window, two pointers, and more.

Linked Lists: problems involving sequential-access data structures and pointer manipulation techniques.

Trees: binary trees, BSTs, traversal methods, and tree manipulation algorithms.

Graphs: BFS, DFS, and other graph algorithms to solve network and connectivity problems.

Heaps: priority queue implementations and problems involving finding kth elements.

Dynamic Programming: breaking down complex problems into simpler subproblems with overlapping solutions.

Based on common interview problems from LeetCode.com


Approach to Problem Solving
Understanding the process for tackling LeetCode challenges

General Problem-Solving Framework

1. Understand: Read the problem twice. Identify inputs, outputs, and constraints. Ask clarifying questions.
2. Plan: Verbalize different approaches. Start with a brute-force solution, then optimize. Identify patterns.
3. Implement: Code your solution step by step. Focus on readability and correctness before optimization.
4. Verify: Test with examples. Analyze edge cases. Evaluate time and space complexity.

Big O Notation Refresher

Used to describe the performance of algorithms as input sizes grow:

• Constant O(1): operations independent of input size
• Logarithmic O(log n): divide-and-conquer algorithms
• Linear O(n): operations proportional to input size
• Linearithmic O(n log n): most efficient sorting algorithms
• Quadratic O(n²): nested loops, comparing each element
• Exponential O(2ⁿ): work doubles with each additional input element

Problem Categories We'll Cover

Arrays, Linked Lists, Stacks, Queues, Trees, Graphs
Array Problems
Fundamental techniques for handling sequences of elements

Contains Duplicate
Find if the array contains any duplicate values.
Input: [1,2,3,1]
Output: true
Solution Approach: Use a hash set to track seen values while iterating through the array.
Time: O(n)  Space: O(n)

Two Sum
Find two numbers that add up to the target.
Input: nums = [2,7,11,15], target = 9
Output: [0,1]
Solution Approach: Use a hash map to store visited numbers and their indices. For each number, check if (target - current) exists in the map.
Time: O(n)  Space: O(n)

Missing Number
Find the missing number in range [0,n].
Input: [3,0,1]
Output: 2
Solution Approach: Use the sum formula: calculate the expected sum from 0 to n and subtract the actual sum of the array.
Time: O(n)  Space: O(1)

Find Disappeared Numbers
Find all missing integers in range [1,n].
Input: [4,3,2,7,8,2,3,1]
Output: [5,6]
Solution Approach: Mark presence by using the array itself (negative marking), then find unmarked positions.
Time: O(n)  Space: O(1)

Smaller Than Current
Count numbers smaller than the current number.
Input: [8,1,2,2,3]
Output: [4,0,1,1,3]
Solution Approach: Sort a copy of the array and use a hash map to track the count of smaller elements.
Time: O(n log n)  Space: O(n)

Min Time to Visit Points
Find the minimum time to visit all points in a 2D plane.
Input: [[1,1],[3,4],[-1,0]]
Output: 7
Solution Approach: Use Chebyshev distance (the maximum of the absolute differences between the x and y coordinates).
Time: O(n)  Space: O(1)

Common Array Techniques: Two Pointers, Sliding Window, Hash Map/Set, Sorting
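The hash-map approach for Two Sum and the sum-formula approach for Missing Number described above can be sketched in Python (a minimal illustration, not a reference solution):

```python
def two_sum(nums, target):
    # Map each value to its index; check for the complement as we go.
    seen = {}
    for i, x in enumerate(nums):
        if target - x in seen:
            return [seen[target - x], i]
        seen[x] = i
    return []

def missing_number(nums):
    # Expected sum of 0..n minus the actual sum gives the missing value.
    n = len(nums)
    return n * (n + 1) // 2 - sum(nums)
```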
Array Techniques: Two Pointers & Sliding Window
Efficient approaches for solving array problems

 Two Pointers Technique


Using two pointer variables to solve problems with O(n) time complexity instead of O(n²).

Best Time to Buy and Sell Stock
Find the maximum profit by buying low and selling high.
Input: [7,1,5,3,6,4]
Output: 5
Use the left pointer for the buying day and the right pointer to check potential selling days, tracking the maximum profit.
Time: O(n)  Space: O(1)

Squares of a Sorted Array
Return a sorted array of squared values.
Input: [-4,-1,0,3,10]
Output: [0,1,9,16,100]
Use two pointers (left and right) to compare absolute values and build the sorted result array from the end.
Time: O(n)  Space: O(n)

3Sum
Find all triplets that sum to zero.
Input: [-1,0,1,2,-1,-4]
Output: [[-1,-1,2],[-1,0,1]]
Sort the array, then for each element use two pointers to find pairs that sum to the target. Skip duplicates.
Time: O(n²)  Space: O(n)

 Sliding Window Technique


Using a window that slides through array elements to process subarrays of fixed or variable size.

Contains Duplicate II
Find if duplicates exist within k distance.
Input: nums = [1,2,3,1], k = 3
Output: true
Maintain a sliding window of size k using a set, adding new elements and removing elements that fall outside the window.
Time: O(n)  Space: O(k)

Minimum Absolute Difference
Find pairs with the smallest absolute difference.
Input: [4,2,1,3]
Output: [[1,2],[2,3],[3,4]]
Sort the array, then use a sliding window of size 2 to compare adjacent elements and find the minimum difference.
Time: O(n log n)  Space: O(n)

Minimum Size Subarray Sum
Find the smallest subarray with sum ≥ target.
Input: target = 7, nums = [2,3,1,2,4,3]
Output: 2
Use a variable-size sliding window: expand right until sum ≥ target, then contract left to minimize the length.
Time: O(n)  Space: O(1)

Key Insights:

• Two pointers often turn O(n²) solutions into O(n)
• Sliding window works well for substring/subarray problems
• Pre-sorting can make both techniques more effective
• Hash sets/maps can optimize window operations
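The variable-size window described for Minimum Size Subarray Sum can be sketched as follows (a minimal illustration; assumes positive integers, as in the original problem):

```python
def min_subarray_len(target, nums):
    # Expand the right edge; contract the left edge while the sum stays >= target.
    left = 0
    window_sum = 0
    best = float("inf")
    for right, x in enumerate(nums):
        window_sum += x
        while window_sum >= target:
            best = min(best, right - left + 1)
            window_sum -= nums[left]
            left += 1
    return 0 if best == float("inf") else best
```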
Bit Manipulation & Dynamic Programming
Low-level operations and optimization techniques

Bit Manipulation
Operations on individual bits in binary representations

Single Number
Find the element that appears once in an array where every other element appears twice.
Input: [4,1,2,1,2]
Output: 4
Solution Approach: Use the XOR (^) operation. XOR of any number with itself is 0, and XOR with 0 returns the original number.
Time: O(n)  Space: O(1)

Counting Bits
Count the number of 1's in the binary representation of each number from 0 to n.
Input: n = 5
Output: [0,1,1,2,1,2]
Solution Approach: Use dynamic programming where dp[i] = dp[i & (i-1)] + 1. Each value is 1 plus the count for the number with its rightmost 1 removed.
Time: O(n)  Space: O(n)

Common Bit Operations

AND (&): 1 & 1 = 1, 1 & 0 = 0, 0 & 0 = 0
OR (|):  1 | 1 = 1, 1 | 0 = 1, 0 | 0 = 0
XOR (^): 1 ^ 1 = 0, 1 ^ 0 = 1, 0 ^ 0 = 0
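The XOR trick and the dp[i & (i-1)] recurrence described above can be sketched as follows (a minimal illustration):

```python
from functools import reduce
from operator import xor

def single_number(nums):
    # XOR cancels paired values, leaving the unique element.
    return reduce(xor, nums)

def counting_bits(n):
    # dp[i] = dp[i & (i - 1)] + 1: clearing the lowest set bit
    # leaves a smaller number whose bit count is already known.
    dp = [0] * (n + 1)
    for i in range(1, n + 1):
        dp[i] = dp[i & (i - 1)] + 1
    return dp
```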

 Dynamic Programming
Breaking down complex problems into simpler sub-problems

Coin Change
Find the fewest coins needed to make up a given amount.
Input: coins = [1,2,5], amount = 11
Output: 3 (5 + 5 + 1)
Solution Approach: Use bottom-up DP where dp[i] represents the fewest coins needed to make amount i. For each coin, update dp[i] = min(dp[i], dp[i-coin] + 1).
Time: O(n*m)  Space: O(n)

Climbing Stairs
Count distinct ways to climb n stairs taking 1 or 2 steps at a time.
Input: n = 3
Output: 3 (1+1+1, 1+2, 2+1)
Solution Approach: Use DP where dp[i] = dp[i-1] + dp[i-2]. This follows the Fibonacci sequence, since you can reach step i from either step i-1 or step i-2.
Time: O(n)  Space: O(1)

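The bottom-up Coin Change recurrence described above can be sketched as follows (a minimal illustration):

```python
def coin_change(coins, amount):
    # dp[i] = fewest coins needed to make amount i; INF marks unreachable amounts.
    INF = amount + 1
    dp = [0] + [INF] * amount
    for i in range(1, amount + 1):
        for coin in coins:
            if coin <= i:
                dp[i] = min(dp[i], dp[i - coin] + 1)
    return dp[amount] if dp[amount] != INF else -1
```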
Backtracking Problems
Building solutions incrementally and backtracking when needed

What is Backtracking?
An algorithmic technique that builds a solution incrementally, abandoning a path when it determines the path cannot lead to a valid solution.

Letter Case Permutation
Generate all possible permutations by changing the case of letters.
Input: "a1b2"
Output: ["a1b2","a1B2","A1b2","A1B2"]
Solution Approach: Recursively explore each character, creating two branches for alphabetic characters (uppercase and lowercase) and one branch for digits.
Time: O(2ⁿ)  Space: O(n·2ⁿ)

Subsets
Generate all possible subsets of a given set of distinct integers.
Input: [1,2,3]
Output: [[],[1],[2],[3],[1,2],[1,3],[2,3],[1,2,3]]
Solution Approach: For each element, decide either to include it in the current subset or to exclude it, then recursively build from there.
Time: O(n·2ⁿ)  Space: O(n·2ⁿ)

Combinations
Find all possible k-sized combinations of numbers from 1 to n.
Input: n = 4, k = 2
Output: [[1,2],[1,3],[1,4],[2,3],[2,4],[3,4]]
Solution Approach: Recursively build combinations by selecting a starting number, then adding k-1 more numbers greater than the starting one.
Time: O(k·C(n,k))  Space: O(k·C(n,k))

Permutations
Generate all possible permutations of a list of distinct integers.
Input: [1,2,3]
Output: [[1,2,3],[1,3,2],[2,1,3],[2,3,1],[3,1,2],[3,2,1]]
Solution Approach: For each position, try all possible numbers that haven't been used yet. Use swapping or a visited set to track used elements.
Time: O(n·n!)  Space: O(n·n!)

Backtracking Pattern Template:

def backtrack(current_state, remaining_choices):
    if is_solution(current_state):
        add_to_solutions(current_state)
        return

    for choice in remaining_choices:
        if is_valid(current_state, choice):
            # Make the choice
            apply(current_state, choice)

            # Recursively solve with this choice
            backtrack(current_state, remaining_choices)

            # Undo the choice (backtrack)
            undo(current_state, choice)
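As a concrete instance of this pattern, the Subsets problem above can be sketched as follows (a minimal illustration):

```python
def subsets(nums):
    result = []

    def backtrack(start, current):
        # Every partial state is itself a valid subset.
        result.append(current[:])
        for i in range(start, len(nums)):
            current.append(nums[i])    # make the choice
            backtrack(i + 1, current)  # recurse on later elements only
            current.pop()              # undo the choice (backtrack)

    backtrack(0, [])
    return result
```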

Linked List Problems
Sequential access data structures with dynamic memory allocation

 What is a Linked List?


A linear data structure where elements are stored in nodes that point to the next node in the sequence. Unlike arrays, linked lists allow for efficient insertions and deletions.

1 → 2 → 3 → 4

Middle of Linked List
Find the middle node of a linked list.
Input: [1,2,3,4,5]
Output: [3,4,5]
Solution Approach: Use slow and fast pointers. Move the slow pointer one step at a time and the fast pointer two steps. When fast reaches the end, slow is at the middle.
Time: O(n)  Space: O(1)

Linked List Cycle
Determine if a linked list has a cycle.
Input: head = [3,2,0,-4], pos = 1
Output: true
Solution Approach: Use Floyd's tortoise-and-hare algorithm. If the fast pointer catches up to the slow pointer, there is a cycle.
Time: O(n)  Space: O(1)

Reverse Linked List
Reverse a singly linked list.
Input: [1,2,3,4,5]
Output: [5,4,3,2,1]
Solution Approach: Iteratively change the direction of the links using three pointers: previous, current, and next.
Time: O(n)  Space: O(1)

Palindrome Linked List
Check if a linked list is a palindrome.
Input: [1,2,2,1]
Output: true
Solution Approach: Find the middle, reverse the second half, then compare both halves for equality.
Time: O(n)  Space: O(1)

Common Linked List Techniques:

• Fast & Slow Pointers: finding the middle, detecting cycles, or finding the nth element from the end.
• Runner Technique: using two pointers at different speeds to process the list in a single pass.
• Reversing Techniques: restructuring links to reverse part or all of a linked list.
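The three-pointer reversal described above can be sketched as follows (a minimal illustration with a bare-bones ListNode class for demonstration):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def reverse_list(head):
    # Redirect each node's link to point at the previous node.
    prev = None
    current = head
    while current:
        nxt = current.next   # remember the rest of the list
        current.next = prev  # reverse this link
        prev = current
        current = nxt
    return prev              # prev is the new head
```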

Stack Problems
Last-In-First-Out (LIFO) data structures

What is a Stack?
A linear data structure that follows the Last-In-First-Out (LIFO) principle. Elements are added and removed from the same end, called the "top".

Min Stack
Design a stack that supports push, pop, top, and retrieving the minimum element, all in constant time.
Operations: push(x), pop(), top(), getMin(), all in O(1) time
Solution Approach: Use a stack of pairs, where each pair contains the element value and the minimum value in the stack at that point.
Time: O(1) per operation  Space: O(n)

Valid Parentheses
Check if the input string has valid parentheses ordering.
Input: "({[]})"
Output: true
Solution Approach: Use a stack to keep track of opening brackets. When a closing bracket is encountered, check if it matches the top element of the stack.
Time: O(n)  Space: O(n)

Evaluate Reverse Polish Notation
Evaluate the value of an arithmetic expression in Reverse Polish Notation.
Input: ["2","1","+","3","*"]
Output: 9 ((2 + 1) * 3)
Solution Approach: Use a stack to store numbers. When an operator is encountered, pop the top two elements, apply the operation, and push the result back onto the stack.
Time: O(n)  Space: O(n)

Stack Sorting
Sort a stack in ascending or descending order using only stack operations.
Input: [34, 3, 31, 98, 92, 23]
Output: [3, 23, 31, 34, 92, 98]
Solution Approach: Use a temporary stack. For each element in the original stack, pop elements from the temporary stack that are greater, insert the element, then push back the popped elements.
Time: O(n²)  Space: O(n)

Stack Implementation Pattern:

When to Use Stacks:
• Implementing undo/redo functionality
• Expression evaluation and syntax parsing
• Backtracking algorithms (DFS implementation)
• Converting recursion to iteration
• Balancing symbol problems

Python Implementation

class Stack:
    def __init__(self):
        self.items = []

    def push(self, item):
        self.items.append(item)

    def pop(self):
        if not self.is_empty():
            return self.items.pop()

    def peek(self):
        if not self.is_empty():
            return self.items[-1]

    def is_empty(self):
        return len(self.items) == 0

    def size(self):
        return len(self.items)
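The bracket-matching approach to Valid Parentheses described above can be sketched as follows (a minimal illustration):

```python
def is_valid(s):
    # Push opening brackets; each closing bracket must match the stack top.
    pairs = {")": "(", "]": "[", "}": "{"}
    stack = []
    for ch in s:
        if ch in "([{":
            stack.append(ch)
        elif ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False
    # Valid only if every opening bracket was closed.
    return not stack
```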

Queue Problems
First-In-First-Out (FIFO) data structures

What is a Queue?
A linear data structure that follows the First-In-First-Out (FIFO) principle. Elements are added at the rear and removed from the front.

Implement Stack using Queues
Implement a LIFO stack using only FIFO queue operations.
Operations: push, pop, top, empty, using only queue operations
Solution Approach: Use a queue to store elements. For push: add the element to the queue, then rotate the queue by moving all previous elements to the end to maintain stack order.
Time: O(n) push, O(1) others  Space: O(n)

Time Needed to Buy Tickets
Find the time needed for a person at position k to buy tickets.
Input: tickets = [2,3,2], k = 2
Output: 6
Solution Approach: Calculate the total by considering each person's ticket needs. People at or before position k contribute min(tickets[i], tickets[k]) seconds; people after k contribute min(tickets[i], tickets[k]-1) seconds.
Time: O(n)  Space: O(1)

Reverse First K Elements
Reverse the first k elements of a queue.
Input: q = [1,2,3,4,5], k = 3
Output: [3,2,1,4,5]
Solution Approach: Use a stack to reverse the first k elements: dequeue k elements and push them onto a stack, then pop from the stack and enqueue. Finally, dequeue and re-enqueue the remaining n-k elements.
Time: O(n)  Space: O(k)

When to Use Queues:
• Breadth-First Search (BFS) algorithms
• Task scheduling in operating systems
• Buffering for data streams
• Print job management
• Implementing caches and request handling

Python Queue Implementation

from collections import deque

class Queue:
    def __init__(self):
        self.items = deque()

    def enqueue(self, item):
        self.items.append(item)

    def dequeue(self):
        if not self.is_empty():
            return self.items.popleft()

    def peek(self):
        if not self.is_empty():
            return self.items[0]

    def is_empty(self):
        return len(self.items) == 0

    def size(self):
        return len(self.items)
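The stack-based approach to Reverse First K Elements described above can be sketched with collections.deque (a minimal illustration):

```python
from collections import deque

def reverse_first_k(queue, k):
    # Stage the first k elements on a stack, then re-enqueue them reversed.
    stack = [queue.popleft() for _ in range(k)]
    while stack:
        queue.append(stack.pop())
    # Rotate the remaining n - k elements behind the reversed prefix.
    for _ in range(len(queue) - k):
        queue.append(queue.popleft())
    return queue
```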

Binary Tree Problems
Hierarchical data structures with recursive traversal patterns

What is a Binary Tree?
A hierarchical data structure where each node has at most two children, typically referred to as the left and right child.

Common Traversal Methods:

• Depth-First Search (DFS): explores deep into a branch before moving to siblings. Orders: Preorder (Root, Left, Right), Inorder (Left, Root, Right), Postorder (Left, Right, Root).
• Breadth-First Search (BFS): explores all nodes at one level before moving to the next level. Order: Level Order Traversal.
• Implementation: DFS typically uses recursion or a stack, while BFS uses a queue.

Average of Levels
Find the average value of nodes on each level.
Input: [3,9,20,null,null,15,7]
Output: [3, 14.5, 11]
Use BFS to process nodes level by level, calculating the average at each level.
Time: O(n)  Space: O(n)

Maximum Depth
Find the maximum depth (height) of a binary tree.
Input: [3,9,20,null,null,15,7]
Output: 3
Recursively find the max depth of the left and right subtrees, adding 1 for the current level.
Time: O(n)  Space: O(h)

Same Tree
Check if two binary trees are identical.
Input: p = [1,2,3], q = [1,2,3]
Output: true
Recursively compare corresponding nodes of both trees.
Time: O(n)  Space: O(h)

Diameter of Binary Tree
Find the length of the longest path between any two nodes.
Input: [1,2,3,4,5]
Output: 3
At each node, calculate the diameter as the sum of the left and right subtree heights. Track the maximum diameter.
Time: O(n)  Space: O(h)

Invert Binary Tree
Mirror a binary tree by swapping left and right children.
Input: [4,2,7,1,3,6,9]
Output: [4,7,2,9,6,3,1]
Recursively swap the left and right children at each node.
Time: O(n)  Space: O(h)

Lowest Common Ancestor
Find the lowest common ancestor of two nodes.
Input: root = [3,5,1,6,2,0,8], p = 5, q = 1
Output: 3
Recursively search for p and q. If they are found in different subtrees, the current node is the LCA.
Time: O(n)  Space: O(h)

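The recursive Maximum Depth and Invert Binary Tree approaches described above can be sketched as follows (a minimal illustration with a bare-bones TreeNode class for demonstration):

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def max_depth(root):
    # Depth of a node is 1 + the deeper of its two subtrees.
    if root is None:
        return 0
    return 1 + max(max_depth(root.left), max_depth(root.right))

def invert_tree(root):
    # Swap children at every node, recursing into both subtrees.
    if root:
        root.left, root.right = invert_tree(root.right), invert_tree(root.left)
    return root
```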
Binary Search Tree Problems
Ordered binary trees with efficient search operations

What is a Binary Search Tree?
A special type of binary tree where, for each node:
• All nodes in the left subtree have values < the node's value
• All nodes in the right subtree have values > the node's value
This ordering property enables efficient O(log n) search, insert, and delete operations.

BST Property: 1 < 3 < 6 < 8 < 10 < 14; an inorder traversal gives sorted output.

Search in a BST
Find a node with a given value in a BST.
Input: root = [4,2,7,1,3], val = 2
Output: [2,1,3]
Use the BST property: if the target < node, go left; if the target > node, go right; otherwise, found.
Time: O(h)  Space: O(1)

Insert into a BST
Insert a new node while maintaining BST properties.
Input: root = [4,2,7,1,3], val = 5
Output: [4,2,7,1,3,5]
Traverse to find the insertion point: if val < node.val, go left; if val > node.val, go right; continue until a null position is found.
Time: O(h)  Space: O(1)

Sorted Array to BST
Create a height-balanced BST from a sorted array.
Input: [-10,-3,0,5,9]
Output: [0,-3,9,-10,null,5]
Recursively use the middle element as the root, the left half as the left subtree, and the right half as the right subtree.
Time: O(n)  Space: O(log n)

Two Sum IV - BST
Find if two nodes sum to the target value.
Input: root = [5,3,6,2,4,null,7], k = 9
Output: true (5+4=9)
Use a hash set to store visited values while traversing. For each node, check if (k - node.val) exists in the set.
Time: O(n)  Space: O(n)

Delete Node in a BST
Delete a node while maintaining BST properties.
Input: root = [5,3,6,2,4,null,7], key = 3
Output: [5,4,6,2,null,null,7]
Three cases: 1) no children: remove the node; 2) one child: replace the node with its child; 3) two children: replace the node with its inorder successor.
Time: O(h)  Space: O(1)

Kth Smallest Element
Find the kth smallest element in a BST.
Input: root = [3,1,4,null,2], k = 1
Output: 1
Perform an inorder traversal (which visits nodes in ascending order) and stop at the kth node.
Time: O(h+k)  Space: O(h)

BST Key Advantages:

• Efficient Search: O(log n) search, insert, and delete vs O(n) for unsorted trees
• Ordered Traversal: inorder traversal gives elements in sorted order
• Range Queries: efficiently find all values within a range
• Self-Balancing Variants: AVL trees and Red-Black trees maintain balance
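The inorder-traversal approach to Kth Smallest described above can be sketched as follows (a minimal illustration with a bare-bones TreeNode class for demonstration):

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def kth_smallest(root, k):
    # Iterative inorder traversal: the kth node visited is the answer.
    stack = []
    node = root
    while stack or node:
        while node:
            stack.append(node)
            node = node.left
        node = stack.pop()
        k -= 1
        if k == 0:
            return node.val
        node = node.right
```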

Heap Problems
Priority queue implementations for efficient element ordering

What is a Heap?
A specialized tree-based data structure that satisfies the heap property:
• Max Heap: each parent node is greater than or equal to its children
• Min Heap: each parent node is less than or equal to its children
It efficiently supports finding and removing the maximum/minimum element.

Common Heap Operations:

• Insert: add to the end, then bubble up: O(log n)
• Peek: view the top element (root): O(1)
• Extract: remove the root, replace it with the last element, then sink down: O(log n)

Kth Largest Element in an Array
Find the kth largest element in an unsorted array.
Input: [3,2,1,5,6,4], k = 2
Output: 5
Solution Approach: Use a min heap of size k. For each element, push it onto the heap and, if the size exceeds k, pop the smallest. After processing all elements, the heap root is the kth largest.
Time: O(n log k)  Space: O(k)

K Closest Points to Origin
Find the k closest points to the origin (0,0) in a 2D plane.
Input: [[1,3],[-2,2]], k = 1
Output: [[-2,2]]
Solution Approach: Use a max heap of size k, ordered by distance (x² + y²). For each point, add it to the heap and, if the size exceeds k, pop the farthest point.
Time: O(n log k)  Space: O(k)

Top K Frequent Elements
Find the k most frequent elements in an array.
Input: [1,1,1,2,2,3], k = 2
Output: [1,2]
Solution Approach: 1) Count the frequency of each element. 2) Use a min heap to keep track of the k elements with the highest frequency, ordered by frequency.
Time: O(n log k)  Space: O(n)

Task Scheduler
Schedule tasks with a cooling time between identical tasks.
Input: tasks = ["A","A","A","B","B","B"], n = 2
Output: 8 (A→B→idle→A→B→idle→A→B)
Solution Approach: Use a max heap to track task frequencies. Process the most frequent tasks first, then reinsert them with reduced frequency after the cooling period.
Time: O(n log n)  Space: O(n)

 Heap Implementation in Python:

# Python's heapq module implements a min heap
import heapq

# Create a heap
heap = []

# Add elements (heappush)
heapq.heappush(heap, 3)
heapq.heappush(heap, 1)
heapq.heappush(heap, 4)          # heap is now [1, 3, 4]

# Get the smallest item without removing it (peek)
smallest = heap[0]               # returns 1

# Remove and return the smallest item (heappop)
smallest = heapq.heappop(heap)   # returns 1; heap is now [3, 4]

# For a max heap, negate values
max_heap = []
heapq.heappush(max_heap, -3)
heapq.heappush(max_heap, -1)     # max_heap effectively holds [3, 1]
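Using heapq, the min-heap-of-size-k approach to Kth Largest described above can be sketched as follows (a minimal illustration):

```python
import heapq

def kth_largest(nums, k):
    # Keep a min heap of the k largest values seen so far;
    # its root is then the kth largest overall.
    heap = []
    for x in nums:
        heapq.heappush(heap, x)
        if len(heap) > k:
            heapq.heappop(heap)
    return heap[0]
```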

Graph Problems
Solving complex network and connectivity challenges

What is a Graph?
A collection of nodes (vertices) connected by edges. Graphs can be:
• Directed/Undirected: edges have direction or not
• Weighted/Unweighted: edges carry values or not
• Connected/Disconnected: all nodes can reach each other or not
• Cyclic/Acyclic: contains cycles or not

Common Graph Representations:

Adjacency Matrix
A 2D array where matrix[i][j] indicates whether there is an edge from node i to node j.

    A  B  C  D  E
A [0, 1, 1, 0, 0]
B [1, 0, 0, 1, 0]
C [1, 0, 0, 0, 1]
D [0, 1, 0, 0, 1]
E [0, 0, 1, 1, 0]

+ O(1) edge lookup   - O(V²) space

Adjacency List
A collection of lists/arrays where the index represents a vertex and the values are its adjacent vertices.

A: [B, C]
B: [A, D]
C: [A, E]
D: [B, E]
E: [C, D]

+ O(V+E) space efficient   - O(degree) edge lookup

Graph Traversal Techniques:

• Breadth-First Search (BFS): uses a queue to explore neighbors before moving to the next level. Good for finding shortest paths in unweighted graphs. Time: O(V+E), Space: O(V).
• Depth-First Search (DFS): uses a stack (or recursion) to explore as far as possible along a branch. Good for topological sorting and cycle detection. Time: O(V+E), Space: O(V).

Common Graph Problems:

Clone Graph
Create a deep copy of an undirected graph.
Use BFS or DFS to traverse the graph, maintaining a hash map of original nodes to their clones.
Time: O(V+E)  Space: O(V)

Cheapest Flights Within K Stops
Find the cheapest price from src to dst with at most K stops.
Use a modified Bellman-Ford algorithm to relax edges k+1 times, tracking the minimum cost at each iteration.
Time: O(K·E)  Space: O(V)

Course Schedule
Determine if it's possible to finish all courses given prerequisites.
Use DFS to detect cycles in the directed graph. If a cycle exists, it's impossible to complete all courses.
Time: O(V+E)  Space: O(V+E)

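The BFS traversal described above can be sketched over an adjacency-list graph (a minimal illustration using the slide's example graph):

```python
from collections import deque

def bfs(graph, start):
    # graph: adjacency list mapping each vertex to its neighbors.
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order
```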
Advanced Graph Algorithms
Specialized techniques for solving complex graph problems

Shortest Path Algorithms

Dijkstra's Algorithm
Finds the shortest path from a source to all vertices in a weighted graph with non-negative weights.
Time: O((V+E) log V)  Space: O(V)

Bellman-Ford Algorithm
Works with negative weights and can detect negative cycles. Used in the "Cheapest Flights" problem.
Time: O(V·E)  Space: O(V)

Minimum Spanning Tree

Kruskal's Algorithm
Builds the MST by adding edges in increasing weight order, avoiding cycles with Union-Find.
Time: O(E log E)  Space: O(V)

Prim's Algorithm
Builds the MST by growing a single tree from a starting vertex, always adding the lowest-weight edge.
Time: O(E log V)  Space: O(V)

When to Use: MST algorithms are useful for network design problems (connecting cities, designing circuit layouts) with minimal total weight.

Topological Sort
A linear ordering of vertices such that for every directed edge (u,v), u comes before v in the ordering.

Applications:
• Task scheduling with dependencies (as in the Course Schedule problem)
• Build systems determining compilation order
• Dependency resolution in package managers

# Kahn's Algorithm (BFS approach)
from collections import deque

def topological_sort(graph):
    in_degree = {node: 0 for node in graph}
    for node in graph:
        for neighbor in graph[node]:
            in_degree[neighbor] += 1

    queue = deque(node for node in in_degree if in_degree[node] == 0)
    result = []

    while queue:
        node = queue.popleft()
        result.append(node)
        for neighbor in graph[node]:
            in_degree[neighbor] -= 1
            if in_degree[neighbor] == 0:
                queue.append(neighbor)

    # Fewer results than nodes means the graph contains a cycle
    return result if len(result) == len(graph) else []

Strongly Connected Components
Maximal subgraphs where every vertex is reachable from every other vertex within the subgraph.

Kosaraju's Algorithm
Finds all SCCs in a directed graph using two passes of DFS.
Time: O(V+E)  Space: O(V)

When to Use:
• Finding cycles in directed graphs
• Analyzing connectivity in networks
• Solving 2-SAT problems
• Decomposing complex problems into strongly connected subcomponents

LeetCode Problems Using Advanced Graph Algorithms:

• Cheapest Flights Within K Stops: uses a modified Bellman-Ford to find shortest paths with constraints.
• Course Schedule: uses topological sort to detect whether courses can be completed without circular dependencies.
• Network Delay Time: uses Dijkstra's algorithm to find the time it takes for all nodes to receive a signal.
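Dijkstra's algorithm, used for Network Delay Time, can be sketched with heapq (a minimal illustration over an assumed adjacency list of (neighbor, weight) pairs):

```python
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbor, weight), ...]} with non-negative weights.
    dist = {node: float("inf") for node in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist[node]:
            continue  # stale entry, already found a shorter path
        for neighbor, weight in graph[node]:
            nd = d + weight
            if nd < dist[neighbor]:
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist
```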

Common Algorithmic Patterns and Strategies
Recognizing patterns to improve problem-solving efficiency

Key Problem-Solving Patterns

Sliding Window
Track a subarray or substring with a variable-size window. Useful for finding subarrays with specific properties.
Examples: Minimum Size Subarray Sum, Longest Substring Without Repeating Characters

Two Pointers
Use two pointers to iterate through data structures from opposite ends or at different speeds.
Examples: Two Sum II, Reverse Linked List, Linked List Cycle

Dynamic Programming
Break down problems into overlapping subproblems and store solutions to avoid recalculation.
Examples: Coin Change, Climbing Stairs, Maximum Subarray

Binary Search
Divide the search space in half repeatedly. Works on sorted arrays or spaces with clear boundaries.
Examples: Search in Rotated Sorted Array, Find First and Last Position

More Advanced Patterns

Tree Traversal Strategies
Choose between DFS (pre/in/post-order) and BFS (level order) based on the problem requirements.
Examples: Level Order Traversal, Validate Binary Search Tree

Backtracking
Build solutions incrementally, abandoning paths that cannot lead to valid solutions.
Examples: N-Queens, Permutations, Subsets, Letter Combinations

Heap / Priority Queue
Maintain a collection where you can efficiently find/remove the min/max element.
Examples: Kth Largest Element, Merge K Sorted Lists, Task Scheduler

Union-Find (Disjoint Set)
Track connected components in a graph, with efficient operations to join sets and check connectivity.
Examples: Number of Islands II, Graph Valid Tree, Redundant Connection
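Union-Find, listed above, can be sketched as follows (a minimal illustration using path compression only; union by rank or size is a common further refinement):

```python
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        # Path compression: point nodes closer to the root as we walk up.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already in the same component
        self.parent[ra] = rb
        return True
```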

Strategic Problem-Solving Workflow

1. Identify Pattern: recognize which common pattern fits the problem
2. Consider Edge Cases: empty input, single element, duplicates, etc.
3. Start Simple: begin with brute force, then optimize
4. Trace Examples: walk through small examples by hand
5. Optimize & Refine: improve time/space complexity with known patterns

Optimizing Your Problem Solving Approach
Techniques to improve efficiency and effectiveness

Before You Start Coding

Understand the problem fully
Read the problem statement twice. Identify input/output formats, constraints, and edge cases. Don't rush to code before understanding.

Use pen and paper
Draw out the problem, trace examples, visualize the data structure. This activates different parts of your brain and can lead to insights.

Break it down
Divide complex problems into smaller, manageable sub-problems. Solve each part separately, then combine the solutions.

During Implementation

Start with a simple approach
Begin with a working brute-force solution, then optimize. Having a functional solution provides confidence and a fallback.

Test as you go
Don't wait until the end to test your code. Test key functions individually with simple cases to catch bugs early.

Optimize iteratively
Look for redundant operations, unnecessary data storage, or better algorithms after you have a working solution.

Common Optimization Techniques

Use hash structures
Trading space for time with maps/sets for O(1) lookups.

    seen = set()          # O(1) membership checks
    if x in seen:         # vs. O(n) search in a list
        ...

Memoization
Cache results of expensive function calls to avoid recalculation.

    @functools.lru_cache
    def fib(n):           # each n is only calculated once
        ...

Precomputation
Calculate values once upfront instead of repeatedly.

    prefix_sum = [0]
    for x in nums:
        prefix_sum.append(prefix_sum[-1] + x)
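The lru_cache fragment above can be expanded into a complete, runnable example (a minimal illustration):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Each distinct n is computed once; repeated calls hit the cache.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```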

Effective LeetCode Practice Strategies
Maximizing learning and interview preparation

Structured Practice Approach

Topic-Based Learning
Focus on one data structure or algorithm pattern at a time. Complete 5-10 problems on the same topic before moving on to build expertise.

Timed Practice
Set a timer for 20-45 minutes when solving problems. This simulates interview conditions and improves your ability to solve problems under pressure.

Spaced Repetition
Revisit problems you've solved previously after 1 week, 2 weeks, and 1 month. This reinforces neural pathways and improves long-term retention.

Problem Solving Process

Struggle First
Try to solve each problem for at least 20-30 minutes before looking at solutions. The struggle phase is where deep learning occurs, even when you don't reach the solution.

Study Solutions Thoroughly
Don't just glance at the solution; understand it deeply. Read multiple solutions, understand the trade-offs, and identify the patterns being applied.

Implement From Memory
After understanding a solution, close it and implement it again from memory. This ensures you truly grasp the approach rather than just copying code.

Tracking Your Progress

• Maintain a Log: track problems solved, time taken, and patterns used
• Categorize Problems: label by difficulty, pattern, and company tags
• Set Weekly Goals: aim for consistent practice with measurable targets
• Study Groups: discuss solutions and explain your approach to others
