The document describes the greedy method algorithm design paradigm. It works by making locally optimal choices at each step in order to arrive at a globally optimal solution, and it is best applied to problems with the greedy-choice property, where a globally optimal solution can be reached through a series of local improvements. Examples of problems that can be solved with greedy algorithms include the fractional knapsack problem, task scheduling, and minimum spanning trees. In each case the greedy choice selects the locally best option at every step, such as the item with the highest value-to-weight ratio or the task with the earliest start time.

The Greedy Method



Outline and Reading

• The Greedy Method Technique (§5.1)
• Fractional Knapsack Problem (§5.1.1)
• Task Scheduling (§5.1.2)
• Minimum Spanning Trees (§7.3) [future lecture]



The Greedy Method Technique
The greedy method is a general algorithm design paradigm, built on the following elements:
• configurations: different choices, collections, or values to find
• objective function: a score assigned to configurations, which we want to either maximize or minimize
It works best when applied to problems with the greedy-choice property:
• a globally-optimal solution can always be found by a series of local improvements from a starting configuration.
(A generic code sketch of this pattern follows.)
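As an illustration of these two elements, here is a minimal generic sketch of the greedy pattern in Python. The function greedy and its parameters are hypothetical, not from the slides; it assumes the caller supplies the candidate choices, a scoring (objective) function, and a feasibility test.

# Hypothetical generic greedy skeleton (not from the slides): build up a
# configuration by repeatedly committing to the highest-scoring feasible choice.
def greedy(candidates, score, feasible):
    solution = []                                            # the configuration built so far
    for c in sorted(candidates, key=score, reverse=True):    # greedy order: best score first
        if feasible(solution, c):                            # purely local check; no backtracking
            solution.append(c)
    return solution

The scoring function plays the role of the objective function, and the feasibility test encodes which configurations are allowed; both vary from problem to problem, which is why the greedy choice must be justified separately for each problem.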



Making Change
Problem: Accept n dollars and return a collection of coins with a total value of n dollars.
Configuration: A collection of coins with a total value of n.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the largest coin you can (a code sketch follows the examples).
Example 1: Coins are valued $.32, $.08, $.01
• Has the greedy-choice property, since no amount over $.32 can be made with a minimum number of coins by omitting a $.32 coin (similarly for amounts over $.08 but under $.32).
• For any amount y at least as large as a coin denomination x, if y can be made using only coins smaller than x with a total of n coins, then it can also be made using coin x with at most n coins.
Example 2: Coins are valued $.30, $.20, $.05, $.01
• Does not have the greedy-choice property, since $.40 is best made with two $.20's, but the greedy solution will pick three coins (which ones?)
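A minimal sketch of the greedy rule above in Python; the function name make_change and the use of integer cents (to avoid floating-point rounding) are my own choices, not from the slides.

# Greedy change-making: always return the largest coin that still fits.
# Amounts are in integer cents to avoid floating-point rounding issues.
def make_change(amount_cents, coin_values):
    coins = []
    for coin in sorted(coin_values, reverse=True):   # largest denomination first
        while amount_cents >= coin:                  # take this coin as long as it fits
            coins.append(coin)
            amount_cents -= coin
    return coins

# Example 2 from the slide, where greedy is not optimal:
print(make_change(40, [30, 20, 5, 1]))   # greedy picks [30, 5, 5] -- three coins
# The optimal answer uses only two coins: [20, 20].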



The Fractional Knapsack Problem
Given: A set S of n items, with each item i having
• bi - a positive benefit
• wi - a positive weight
Goal: Choose items with maximum total benefit but with weight at most W.
If we are allowed to take fractional amounts, then this is the fractional knapsack problem.
• In this case, we let xi denote the amount we take of item i
• Objective: maximize Σ_{i∈S} bi (xi / wi)
• Constraint: Σ_{i∈S} xi ≤ W
Example
Given: A set S of n items, with each item i having
• bi - a positive benefit
• wi - a positive weight
Goal: Choose items with maximum total benefit but with weight at most W (the "knapsack" has capacity 10 ml).

Items:             1      2      3      4      5
Weight:            4 ml   8 ml   2 ml   6 ml   1 ml
Benefit:           $12    $32    $40    $30    $50
Value ($ per ml):  3      4      20     5      50

Solution (10 ml total):
• 1 ml of item 5
• 2 ml of item 3
• 6 ml of item 4
• 1 ml of item 2
(A quick check of this solution follows.)
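Checking the solution against the value rates above: the total benefit is 1 ml · $50/ml + 2 ml · $20/ml + 6 ml · $5/ml + 1 ml · $4/ml = $50 + $40 + $30 + $4 = $124, and the total weight is 1 + 2 + 6 + 1 = 10 ml, exactly the knapsack capacity.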
The Fractional Knapsack Algorithm
Greedy choice: Keep taking the item with the highest value (benefit-to-weight ratio).
• If a heap-based priority queue is used to store the items, the time complexity is O(n log n).
Correctness: Suppose there is a better solution.
• Then there is an item i with a higher value than some chosen item j (i.e., vj < vi); if we replace some of item j with item i, we get a better solution.
• Thus, there is no better solution than the greedy one.
(A Python sketch of the pseudocode follows.)

Algorithm fractionalKnapsack(S, W)
  Input: set S of items with benefit bi and weight wi; maximum total weight W
  Output: amount xi of each item i to maximize benefit with weight at most W
  for each item i in S
    xi ← 0
    vi ← bi / wi        {value}
  w ← 0                 {current total weight}
  while w < W and S is not empty
    remove item i with highest vi
    xi ← min{wi, W − w}
    w ← w + min{wi, W − w}
Task Scheduling
Given: a set T of n tasks, each having:
• A start time, si
• A finish time, fi (where si < fi)
Goal: Perform all the tasks using a minimum number of "machines."

[Figure: tasks laid out on Machines 1–3 along a timeline from 1 to 9]



Task Scheduling Algorithm
Greedy choice: consider tasks in order of their start time and use as few machines as possible with this order.
• Run time: O(n log n).
Correctness: Suppose there is a better schedule.
• It would use k − 1 machines, while the greedy algorithm uses k.
• Let i be the first task scheduled on machine k.
• Task i must conflict with k − 1 other tasks, one on each of the other machines, so there are k mutually conflicting tasks.
• But that means there is no non-conflicting schedule using k − 1 machines.
(A Python sketch of the pseudocode follows.)

Algorithm taskSchedule(T)
  Input: set T of tasks with start time si and finish time fi
  Output: non-conflicting schedule with minimum number of machines
  m ← 0                 {number of machines}
  while T is not empty
    remove task i with smallest si
    if there is a machine j with no conflict with task i then
      schedule i on machine j
    else
      m ← m + 1
      schedule i on machine m
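A minimal runnable version of this pseudocode in Python; the function name task_schedule and the convention that a task may start exactly when another finishes without conflict are my own assumptions, not stated on the slides.

# Schedule tasks (start, finish) on as few machines as possible: process tasks
# by increasing start time and reuse a machine whenever its most recently
# scheduled task has already finished (finish <= next start).
def task_schedule(tasks):
    machines = []                                   # machines[j] holds the tasks on machine j+1
    for s, f in sorted(tasks, key=lambda t: t[0]):  # task with smallest si first
        for jobs in machines:
            if jobs[-1][1] <= s:                    # no conflict on this machine
                jobs.append((s, f))
                break
        else:                                       # every existing machine conflicts
            machines.append([(s, f)])               # m <- m + 1: open a new machine
    return machines

# The example from the next slide:
tasks = [(1, 4), (1, 3), (2, 5), (3, 7), (4, 7), (6, 9), (7, 8)]
for j, jobs in enumerate(task_schedule(tasks), start=1):
    print(f"Machine {j}: {jobs}")   # the greedy schedule uses 3 machines

Checking only the most recent task on each machine suffices here because tasks are considered in order of start time, so every earlier task on a machine has already finished before its last one did.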
Example
Given: a set T of n tasks, each having:
• A start time, si
• A finish time, fi (where si < fi)
• Tasks: [1,4], [1,3], [2,5], [3,7], [4,7], [6,9], [7,8] (ordered by start)
Goal: Perform all tasks on a minimum number of machines (a trace of the greedy schedule follows the figure).

[Figure: the resulting schedule places the seven tasks on three machines along the timeline 1–9]
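Tracing the greedy rule on the tasks in the order listed (and assuming, as in the sketch above, that a task may start exactly when another finishes): [1,4] opens Machine 1; [1,3] conflicts with it and opens Machine 2; [2,5] conflicts with both and opens Machine 3; [3,7] fits after [1,3] on Machine 2; [4,7] fits after [1,4] on Machine 1; [6,9] fits after [2,5] on Machine 3; and [7,8] fits after [4,7] on Machine 1. The greedy schedule therefore uses three machines.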
