Saint Louis University | Computer Science 3100 | Dept. of Computer Science
Topics: Dynamic Programming and Greedy Algorithms
Related Reading: Chapters 15, 16 of CLRS
Due: 10:00am, Monday, October 10, 2016
You must also adhere to the policies on academic integrity, paying particular attention to the limits on collaboration.
Problem 15-3 (on page 405 of CLRS).
We will give you a more specific hint. Presume that we number the points p1, ..., pn when ordered from left to right. Let notation
Your tasks are as follows:
During hard times, a company with n employees is planning to cut costs through massive layoffs. The company is organized using a classic hierarchy that can be modeled as a general tree, with everyone (other than the president) reporting directly to one other person. The Board of Trustees is considering whom to lay off (including, perhaps, the president); however, to make sure that responsibilities for current work are not reassigned too far, they have decided that they will never fire both an employee and his or her direct manager.
Given this constraint, their goal is to shed as much salary as possible through the layoffs. For notation, let s_p denote the current salary for person p. There are 2^n possible subsets of employees to consider laying off (note that not all of those subsets will meet the constraint about an employee and manager), but a brute-force search through all of those subsets is quite expensive. Fortunately, dynamic programming can be used to find the optimal subset of employees to fire in considerably less time.
You are to describe such an efficient solution using dynamic programming. In evaluating your writeup, we will be looking for how you address the following issues:
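To illustrate the shape of recurrence we have in mind (as a sketch only; your writeup must still define and justify your own subproblems), the problem can be viewed as choosing, for each person, the best savings in that person's subtree under two scenarios. The function name and the children-dictionary representation below are our own illustrative choices, not part of the problem statement.

```python
def max_salary_shed(salary, children, root):
    """Maximum total salary that can be shed, never firing both an
    employee and his or her direct manager.

    salary:   dict mapping employee -> salary s_p
    children: dict mapping manager -> list of direct reports
    root:     the president
    """
    def dp(v):
        # Returns a pair: (best shed if v is fired, best shed if v is kept).
        fired = salary[v]   # firing v forces every direct report to stay
        kept = 0            # keeping v leaves each report's choice free
        for c in children.get(v, []):
            f, k = dp(c)
            fired += k
            kept += max(f, k)
        return fired, kept

    return max(dp(root))
```

Each employee is visited once, so this sketch runs in time linear in n.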
Problem 16-2 (on page 447 of CLRS).
Although the book asks you to design and analyze two algorithms, I'll help you out by giving you both algorithms. Your only responsibility is to provide a rigorous proof of optimality for each.
Run the tasks in nondecreasing order of their processing times.
During any fixed unit of time, run a portion of a released, yet unfinished, task having the smallest remaining processing time (where the remaining processing time is equal to the original processing time minus any time that the task has been processed thus far).
Note as well that minimizing the average completion time is equivalent to minimizing the sum of the completion times, since the number of jobs is fixed. So you are welcome to argue about the sum of the completion times as a convenient metric.
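To make the first rule concrete, here is a small sanity check (not a substitute for your proof) comparing shortest-processing-time-first against brute force on the sum of completion times; the function names are our own.

```python
from itertools import permutations

def sum_completion_times(order):
    """Sum of completion times when tasks run back-to-back in this order."""
    clock = total = 0
    for p in order:
        clock += p
        total += clock
    return total

def spt(times):
    """Run tasks in nondecreasing order of processing time."""
    return sum_completion_times(sorted(times))

def brute_force(times):
    """Exhaustively try every ordering (exponential; small inputs only)."""
    return min(sum_completion_times(order) for order in permutations(times))
```

On the instance [3, 1, 2], the SPT order (1, 2, 3) finishes tasks at times 1, 3, and 6, for a total of 10, which matches the brute-force optimum.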
Consider the following optimization problem. Someone is in charge of collecting all of the flags left interspersed on a ski slope. They must do so by skiing to and grabbing each flag. The collector may grab more than one flag on a single run down the hill, but since one cannot ski uphill, it may take more than one ski run to gather all the flags. We would like to be able to suggest a plan that uses as few runs as possible.
We model the problem as follows. The slope itself will be viewed as an n-by-n grid. The top of the ski-slope will be the top-left corner of the grid, and the hill is sloped so that a skier can move either downward or rightward at each individual step, until eventually ending up at the bottom-right corner of the grid. The flags will be placed at certain grid locations and the collector will be informed of the overall number of flags and the grid coordinates of each flag before beginning.
Someone has suggested the following greedy approach: When planning the first run, calculate a path that maximizes the number of flags that will be collected on that run (let's not worry about precisely how we calculate this local solution -- just assume that we have a way to find such a path and use it). With those flags removed, repeat this approach, choosing a second run that maximizes the number of remaining flags that can be collected, and so on until collecting all flags. If there is ever a tie in choosing the path with the most flags, you may assume an arbitrary such path will be chosen.
An example of this approach is diagrammed as follows:
Although the 3 runs produced by the algorithm on this example are optimal, the algorithm does not always achieve a solution with the fewest number of overall runs. Demonstrate the suboptimality of the strategy as follows:
Give an explicit instance of the problem, specifying the size of the grid and the exact placements of the flags within that grid.
Clearly diagram the order of runs that you suggest would be chosen according to the greedy algorithm outlined.
In a separate diagram, clearly identify a strictly smaller collection of runs that can be used to collect all flags.
Consider the skiing problem introduced in the previous exercise. Another greedy algorithm has been suggested for minimizing the number of runs. Each run is built with the following rules. Starting at the top-left corner, the next step will be rightward if there remains one or more flags somewhere further right in the current row, or trivially if the run has already reached the bottom row. Otherwise the next step taken is downward.
Give a rigorous proof that this rule succeeds in producing a solution with a minimal number of runs for any problem instance.
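As a starting point for reasoning about the rule, it can be simulated directly. The sketch below counts the runs the rule produces; the representation (0-indexed (row, col) flag coordinates on an n-by-n grid) is our own choice, and the simulation is of course not itself a proof.

```python
def greedy_runs(n, flags):
    """Count the runs used by the row-sweep greedy rule on an n-by-n grid.
    flags: set of (row, col) flag positions, 0-indexed from top-left."""
    remaining = set(flags)
    runs = 0
    while remaining:
        runs += 1
        r = c = 0
        while (r, c) != (n - 1, n - 1):
            remaining.discard((r, c))  # grab a flag here, if any
            if r == n - 1:
                c += 1      # bottom row: only rightward moves remain
            elif any((r, cc) in remaining for cc in range(c + 1, n)):
                c += 1      # a flag remains further right in this row
            else:
                r += 1      # otherwise drop down a row
        remaining.discard((r, c))
    return runs
```

For example, on a 3-by-3 grid with flags at (0,2), (1,1), and (2,0), no single monotone path can collect two of them, and the simulation uses three runs.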
Implement a solution to the following programming problem, which can be solved successfully using dynamic programming: Narrow Art Gallery. Your software is expected to accept a single test case from standard input, and to produce the corresponding output to standard output.
Note that you can verify the success of your implementation either by submitting online at the kattis.com website, or by executing the command

/public/goldwasser/icpc/submit narrowartgallery.SUFFIX

where SUFFIX is either java, cpp, or py, reflecting your choice of programming language.
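One way to structure the core dynamic program is sketched below, under our reading of the problem (verify the closure rules and the exact input/output format against the official statement): a 2-by-N gallery in which exactly K rooms are closed, no row may have both rooms closed, closed rooms in consecutive rows may not sit on opposite sides (either would block visitors), and the goal is to maximize the total value of the rooms that stay open. I/O handling is intentionally omitted.

```python
def max_retained(rows, k):
    """rows: list of (left, right) room values per row of the gallery.
    Close exactly k rooms, keeping a walkable path end to end, so as
    to maximize the total value of the rooms left open.

    State s records the previous row's closure:
    0 = neither room closed, 1 = left closed, 2 = right closed."""
    NEG = float("-inf")
    dp = [[NEG] * 3 for _ in range(k + 1)]
    dp[0][0] = 0
    for left, right in rows:
        ndp = [[NEG] * 3 for _ in range(k + 1)]
        for closed in range(k + 1):
            for s in range(3):
                best = dp[closed][s]
                if best == NEG:
                    continue
                # Option 1: keep both rooms of this row open.
                ndp[closed][0] = max(ndp[closed][0], best + left + right)
                if closed < k:
                    if s != 2:  # closing left is safe unless right was just closed
                        ndp[closed + 1][1] = max(ndp[closed + 1][1], best + right)
                    if s != 1:  # symmetric rule for closing the right room
                        ndp[closed + 1][2] = max(ndp[closed + 1][2], best + left)
        dp = ndp
    return max(dp[k])
```

With N rows, K closures, and three states per row, this runs in O(N * K) time, comfortably fast for contest limits.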