
Data Structure and Algorithm
Lecture 6: Greedy Algorithms
Reference: Introduction to Algorithms by Cormen et al., Chapter 17: Greedy Algorithms

Greedy Algorithms
Algorithms for optimization problems typically go through a sequence of steps, with a set of choices at each step. For many optimization problems a greedy algorithm can be used (though not always). A greedy algorithm always makes the choice that looks best at the moment: it makes a locally optimal choice in the hope that this choice will lead to a globally optimal solution.

Examples:
- Activity selection problem
- Dijkstra's shortest path algorithm
- Minimum spanning tree problem

Activity Selection Problem
Definition: Scheduling a resource among several competing activities.

Elaboration: Suppose we have a set S = {1, 2, ..., n} of n proposed activities that wish to use a resource, such as a lecture hall, which can be used by only one activity at a time. Each activity i has a start time s_i and a finish time f_i, where s_i <= f_i.

Compatibility: Activities i and j are compatible if the intervals [s_i, f_i) and [s_j, f_j) do not overlap (i.e., s_i >= f_j or s_j >= f_i).

Goal: Select a maximum-size set of mutually compatible activities.

Activity Selection Problem (Cont.)
Assume the input activities are sorted by increasing finish time [sorting costs O(n lg n)]. Here s and f are the start-time and finish-time arrays, respectively.

Activity_Selector(s, f)      [Complexity: O(n)]
    n = length(s)
    A = {1}
    j = 1
    for i = 2 to n do
        if s_i >= f_j then
            A = A U {i}
            j = i
    return A
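The pseudocode above can be sketched in Python as follows. The function name and the sample activities are illustrative, not from the slides; the sort step is included so the function works on unsorted input.

```python
# A minimal sketch of the greedy activity selector, assuming activities
# are given as (start, finish) pairs.
def activity_selector(activities):
    """Select a maximum-size set of mutually compatible activities.

    The greedy rule requires activities to be considered in order of
    increasing finish time, so we sort first (O(n lg n)).
    """
    ordered = sorted(activities, key=lambda a: a[1])
    selected = [ordered[0]]            # always take the earliest finisher
    last_finish = ordered[0][1]
    for start, finish in ordered[1:]:  # O(n) greedy scan
        if start >= last_finish:       # compatible with the last choice
            selected.append((start, finish))
            last_finish = finish
    return selected

# Hypothetical instance in the style of textbook examples:
acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9),
        (6, 10), (8, 11), (8, 12), (2, 13), (12, 14)]
print(activity_selector(acts))  # → [(1, 4), (5, 7), (8, 11), (12, 14)]
```

Note that the loop keeps only the finish time of the most recently selected activity; earlier selections can never conflict with a later candidate, which is what makes the single O(n) pass sufficient.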

Activity Selection Problem (Cont.)
The next selected activity is always the one with the earliest finish time that can be legally scheduled. The activity picked is thus a greedy choice in the sense that it leaves as much opportunity as possible for the remaining activities to be scheduled; that is, the greedy choice is the one that maximizes the amount of unscheduled time remaining.

Elements of the Greedy Strategy
A greedy algorithm obtains an optimal solution by making a sequence of choices: at each step, the choice that seems best at the moment is taken. This strategy does not always produce an optimal solution. How, then, can one tell whether a greedy algorithm will solve a particular optimization problem?

Elements of the Greedy Strategy (Cont.)
There is no way to tell in general, but there are two ingredients exhibited by most problems that yield to a greedy approach:
1. The greedy-choice property
2. Optimal substructure

Greedy-Choice Property
A globally optimal solution can be arrived at by making a locally optimal (greedy) choice. We make whatever choice seems best at the moment and then solve the subproblems that arise after the choice is made. The choice made by a greedy algorithm may depend on the choices made so far, but it cannot depend on any future choices or on the solutions to subproblems. Thus a greedy strategy usually progresses in a top-down fashion, making one greedy choice after another and iteratively reducing each problem instance to a smaller one.

Optimal Substructure
A problem exhibits optimal substructure if an optimal solution to the problem contains within it optimal solutions to subproblems. In the activity selection problem, if an optimal solution A begins with activity 1, then the set of activities A' = A - {1} is an optimal solution to the activity selection problem on S' = {i ∈ S : s_i >= f_1}.
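A small brute-force check can illustrate this property. The data and helper names below are hypothetical; the point is only that the best solution containing activity 1 has size one plus the best solution on S'.

```python
# Illustrating optimal substructure on a tiny hypothetical instance:
# an optimal solution starting with activity 1 contains an optimal
# solution to the subproblem of activities that start after f_1.
from itertools import combinations

def compatible(acts):
    """True if no two (start, finish) half-open intervals overlap."""
    acts = sorted(acts, key=lambda a: a[1])
    return all(acts[k][1] <= acts[k + 1][0] for k in range(len(acts) - 1))

def best_size(acts):
    """Size of a maximum mutually compatible subset (brute force)."""
    return max(len(c) for r in range(len(acts) + 1)
               for c in combinations(acts, r) if compatible(c))

S = [(1, 4), (3, 5), (5, 7), (6, 10), (8, 11)]
f1 = 4                                   # finish time of activity 1 = (1, 4)
S_prime = [a for a in S if a[0] >= f1]   # S' = {i in S : s_i >= f_1}
# An optimal solution containing (1, 4) has size 1 + best_size(S').
assert best_size(S) == 1 + best_size(S_prime)
```

This is only a sanity check on one instance, of course; the slides' claim is proved in general by a cut-and-paste argument in Cormen's text.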

Knapsack Problem (Fractional)
We are given n objects and a knapsack. Object i has weight w_i, and the knapsack has capacity M. If a fraction x_i (0 <= x_i <= 1) of object i is placed into the knapsack, then a profit of p_i x_i is earned. The objective is to obtain a filling of the knapsack that maximizes the total profit earned:

    Maximize   ∑ p_i x_i      [1 <= i <= n]
    subject to ∑ w_i x_i <= M  [1 <= i <= n]

Knapsack Problem (Fractional) (Cont.)
[P = profit array of the objects, W = weight array of the objects, X = solution (fraction) array, n = number of objects, M = knapsack capacity]
[Objects are ordered by non-increasing profit density, so that P(i)/W(i) >= P(i+1)/W(i+1).]

Knapsack(M, n)      [Complexity: O(n)]
    X = 0            // initialize the fraction vector (array)
    cu = M           // remaining knapsack capacity
    for i = 1 to n do
        if W(i) > cu then
            exit the loop
        X(i) = 1
        cu = cu - W(i)
    if i <= n then
        X(i) = cu / W(i)   // take a fraction of the first object that does not fit
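A runnable Python sketch of the routine above follows. The function name and the sample instance are illustrative, not from the slides; the density sort is made explicit so the precondition on object order is visible in the code.

```python
# A minimal sketch of the greedy fractional-knapsack routine.
def fractional_knapsack(profits, weights, capacity):
    """Return the fraction vector x maximizing total profit.

    Greedy rule: consider objects in order of non-increasing profit
    density p_i / w_i, taking each whole object while it fits and a
    fraction of the first object that does not fit.
    """
    n = len(profits)
    # Indices sorted by non-increasing profit density, as the slide assumes.
    order = sorted(range(n), key=lambda i: profits[i] / weights[i],
                   reverse=True)
    x = [0.0] * n
    remaining = capacity
    for i in order:
        if weights[i] > remaining:
            x[i] = remaining / weights[i]  # fractional fill, then stop
            break
        x[i] = 1.0                         # take the whole object
        remaining -= weights[i]
    return x

# Classic small instance (hypothetical data): 3 objects, capacity 20.
p, w = [25, 24, 15], [18, 15, 10]
x = fractional_knapsack(p, w, 20)
profit = sum(pi * xi for pi, xi in zip(p, x))
print(x, profit)  # → [0.0, 1.0, 0.5] 31.5
```

Unlike the 0/1 knapsack problem, which requires dynamic programming, the fractional variant is solved exactly by this greedy strategy: once objects are density-sorted, a single O(n) pass suffices.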
