15-451 Algorithms 11/11/19
* Online algorithms
- rent-or-buy?
- elevator problem
- paging
==========================================================================
Today's topic: Online Algorithms
Last time: looked at algorithms for finding approximately-optimal
solutions for NP-hard problems. Today: finding approximately-optimal
solutions for problems where difficulty is that the algorithm doesn't
have all the information up front.
Online algorithms: Algorithms for settings where inputs/data arriving
over time. Need to make decisions on the fly, without knowing what
will happen in the future. (As opposed to standard problems like
sorting where you have all inputs at the start.) Data structures are
one example of this. We'll talk about a few other examples today.
Rent-or-buy?
============
Simple online problem that captures a common issue in online
decision-making, called the rent-or-buy problem. Say you are just
starting to go skiing. Can either rent skis for $50 or buy for $500.
You don't know if you'll like it, so you decide to rent. Then you
decide to go again, and again, and after a while you realize you have
shelled out a lot of money renting and you wish you had bought right at the
start. The optimal strategy is: if you know you're going to end up
skiing more than 10 times, buy right at the beginning (more than 10
rentals costs more than the $500 purchase); if you know you're going
to go fewer than 10 times, just rent. But what if you don't know?
To talk about quality of an online algorithm, we can look at what's
called the "competitive ratio":
Competitive ratio is worst case (maximum) over possible futures of
the ratio: (alg cost)/OPT, where OPT = optimal cost in hindsight.
"cost" means total cost over all time.
E.g., what is CR of algorithm that says "buy right away"?
Worst case is: only go once. Ratio is 500/50 = 10.
What about algorithm that says "Rent forever"?
Worst case is: keep going skiing. Ratio is infinite.
Here's a nice strategy: rent until you realize you should have bought,
then buy. (In our case: rent 9 times, then buy).
Case 1: If you went skiing 9 or fewer times, you're optimal.
Case 2: If you went 10+ times, you paid $450 + $500. Opt paid $500.
Ratio = 2 - 1/10. In general, if purchase cost p is a multiple
of rental cost r, the ratio is ((p-r)+p)/p = 2 - r/p.
Worst of these is case 2, so the competitive ratio is 2 - r/p.
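These case computations are easy to check numerically. Here is a small sketch (Python, with hypothetical helper names), using the $50/$500 figures from the text:

```python
RENT, BUY = 50, 500

def strategy_cost(num_trips, threshold=9):
    """Cost of the strategy: rent for `threshold` trips, then buy on the next."""
    if num_trips <= threshold:
        return num_trips * RENT
    return threshold * RENT + BUY

def opt_cost(num_trips):
    """Best cost in hindsight: either rent every time or buy up front."""
    return min(num_trips * RENT, BUY)

# Worst-case ratio over possible futures (number of trips):
ratio = max(strategy_cost(n) / opt_cost(n) for n in range(1, 200))
print(ratio)  # 1.9, i.e. 2 - r/p
```

The worst future is exactly case 2: skiing stops right after the purchase, giving 950/500 = 1.9.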
Claim: above strategy has the best possible competitive ratio for
deterministic algorithms.
Proof: Consider the event that the day you purchase is the last day
you go skiing. If you rent longer than the above strategy does, the
numerator goes up but the denominator stays the same, so your ratio is
worse. If you rent fewer times, then the numerator and the denominator
each go down by r per day; since the numerator is larger than the
denominator, subtracting the same amount from both makes the ratio worse.
The elevator problem
====================
You go up to the elevator and press the button. But who knows how
long it's going to take to come, if ever? How long should you wait
until you give up and take the stairs?
Say it takes time E to get to your floor by elevator (once it comes)
and it takes time S by stairs. E.g., maybe E = 15 sec, S = 45 sec.
What strategy has the best competitive ratio?
- wait 30 sec, then take stairs. (in general, wait for S-E time)
(I.e., take the stairs once you realize you should have taken
them at the start)
- if elevator comes in < 30 sec, we're optimal.
- otherwise, OPT = 45. We took 30+45 sec, so ratio = (30+45)/45 = 5/3
- Or, in general, ratio = (S-E+S)/S = 2 - E/S.
This is really the same as rent-or-buy where stairs=buy, waiting for E
time steps is like renting, and the elevator arriving is like the last
time you ever ski. So, this algorithm is optimal for the same reason.
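The same numeric check works here too (a sketch with hypothetical helper names, using E = 15 and S = 45 from above):

```python
E, S = 15, 45   # elevator ride time, stairs time (seconds)

def strategy_cost(arrival, wait_limit=S - E):
    """Total time for: wait up to `wait_limit` sec for the elevator, then take stairs."""
    if arrival <= wait_limit:
        return arrival + E          # elevator arrived in time
    return wait_limit + S           # gave up and took the stairs

def opt_cost(arrival):
    """Best in hindsight: wait for the elevator, or take the stairs right away."""
    return min(arrival + E, S)

ratio = max(strategy_cost(t) / opt_cost(t) for t in range(0, 500))
print(ratio)  # 5/3, i.e. 2 - E/S
```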
Other problems like this: whether it's worth optimizing code, when
your laptop should stop spinning the disk between accesses, and many others.
Interesting article in the NYT, Sept 29, 2007, about a book by Jason
Zweig called "Your Money and Your Brain", on how people's emotions
affect their investing:
"There is a story in the book about Harry Markowitz, Mr. Zweig said
the other day. He was referring to the renowned economist who shared
a Nobel for helping found modern portfolio theory and proving the
importance of diversification.... Mr. Markowitz was then working at
the RAND Corporation and trying to figure out how to allocate his
retirement account. He knew what he should do: I should have
computed the historical co-variances of the asset classes and drawn
an efficient frontier. (That's efficient-market talk for draining as
much risk as possible out of his portfolio.)"
"But, he said, I visualized my grief if the stock market went way up
and I wasn't in it or if it went way down and I was completely in
it. So I split my contributions 50/50 between stocks and bonds. As
Mr. Zweig notes dryly, Mr. Markowitz had proved incapable of
applying his breakthrough theory to his own money."
So, he wasn't applying his own theory, but he *was* using competitive
analysis: a 50/50 split guarantees you end up with at least half as much
as if you had known in advance what would happen, which is the best
possible competitive ratio.
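The factor-of-2 guarantee is easy to verify: whatever the two assets return, holding half of each ends with at least half the wealth of the better asset in hindsight. A tiny sketch (one investment period, hypothetical return values):

```python
def final_wealth(fractions, returns):
    """Wealth after one period, starting from 1 unit, with the given split."""
    return sum(f * r for f, r in zip(fractions, returns))

# Whatever happens (stocks double, stocks crash, ...), a 50/50 split is
# within a factor of 2 of the best single asset in hindsight:
for returns in [(2.0, 0.5), (0.1, 3.0), (1.0, 1.0)]:
    wealth = final_wealth((0.5, 0.5), returns)
    assert wealth >= 0.5 * max(returns)
```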
Paging:
=======
In paging, we have a disk with N pages. Fast memory with space for k
pages. When a memory request is made, if the page isn't in the fast
memory, we have a page fault. We then need to bring the page into the
fast memory and throw something else out if our space is full. Our
goal is to minimize the number of misses. The
algorithmic question is: what should we throw out? E.g., say k=3 and
the request sequence is 1,2,3,2,4,3,4,1,2,3,4. What would be the
right thing to do in this case if we knew the future? Answer: throw
out the thing whose next request is farthest in the future.
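This optimal offline rule (farthest-in-future, often called Belady's algorithm) can be sketched as follows; the quadratic next-use scan is fine for an illustration:

```python
def farthest_in_future(requests, k):
    """Offline optimal eviction: evict the page whose next use is farthest away."""
    cache, faults = set(), 0
    for i, page in enumerate(requests):
        if page in cache:
            continue
        faults += 1
        if len(cache) == k:
            rest = requests[i + 1:]
            def next_use(p):
                # position of p's next request (infinity if never requested again)
                return rest.index(p) if p in rest else float('inf')
            cache.remove(max(cache, key=next_use))
        cache.add(page)
    return faults

# The example from the text: k=3, requests 1,2,3,2,4,3,4,1,2,3,4
print(farthest_in_future([1, 2, 3, 2, 4, 3, 4, 1, 2, 3, 4], 3))  # 5 faults
```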
A standard online algorithm is LRU: "throw out the least recently used
page". E.g., what would it do on above case? What's a bad case for
it? 1,2,3,4,1,2,3,4,1,2,3,4... Notice that in this case the ratio is
k: LRU faults on every request, while OPT faults only about once every
k requests. One can show this is actually the worst case for LRU, so
LRU has a competitive ratio of k (we won't prove it here).
Can actually show that you can't do better with a deterministic
algorithm.
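A sketch of LRU, run on the bad cyclic sequence above (k = 3, four pages requested round-robin):

```python
from collections import OrderedDict

def lru_faults(requests, k):
    """Count page faults for LRU with a cache of k pages."""
    cache = OrderedDict()               # keys in least- to most-recently-used order
    faults = 0
    for page in requests:
        if page in cache:
            cache.move_to_end(page)     # hit: now most recently used
        else:
            faults += 1
            if len(cache) == k:
                cache.popitem(last=False)   # evict least recently used
            cache[page] = True
    return faults

cyclic = [1, 2, 3, 4] * 5
print(lru_faults(cyclic, 3))  # 20: LRU faults on every single request
```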
Here is a neat randomized algorithm with a comp. ratio of O(log k).
Marking:
- Start with all pages unmarked and k random pages in fast memory.
- When a page is requested:
    - if it's in fast memory already, mark it.
    - if it's not, throw out a random unmarked page, bring the
      requested page in, and mark it. (If all pages in fast memory
      are marked, first unmark everything; for analysis purposes,
      call this the end of a "phase".)
Can think of marks as "recently used" vs "not recently used".
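A runnable sketch of one run of the marking algorithm (for simplicity it starts from an empty cache rather than k random pages, which only adds cold misses; function names are illustrative):

```python
import random

def marking_faults(requests, k, seed=0):
    """One seeded run of the randomized marking algorithm; returns #page faults."""
    rng = random.Random(seed)
    cache, marked, faults = set(), set(), 0
    for page in requests:
        if page in cache:
            marked.add(page)            # hit: just mark it
            continue
        faults += 1                     # miss: must bring the page in
        if len(cache) == k:
            unmarked = cache - marked
            if not unmarked:            # all marked: end of a phase
                marked.clear()
                unmarked = set(cache)
            # sorted() just makes the choice deterministic given the seed
            cache.remove(rng.choice(sorted(unmarked)))
        cache.add(page)
        marked.add(page)
    return faults

# On the cyclic sequence (N = k+1 = 4), LRU faults on all 100 requests;
# marking typically faults far less often:
runs = [marking_faults([1, 2, 3, 4] * 25, 3, seed=s) for s in range(200)]
print(sum(runs) / len(runs))  # average faults per run
```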
Proof for N=k+1. In a phase, you have k+1 different pages
requested so OPT has at least one page fault. For the algorithm, the
page not in fast memory is a *random* unmarked page. Every time a
page is requested: if it was already marked, then no page fault for
sure. If it wasn't marked, then the probability of a page fault is
1/i where i is the number of unmarked pages. So, within a phase, the
expected total cost is 1/k + 1/(k-1) + ... + 1/2 + 1 = O(log k).
For general N, the proof follows similar lines but is a bit more
complicated.
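For a sense of scale, the harmonic sum in the bound grows like the natural log of k:

```python
import math

def harmonic(k):
    """H_k = 1 + 1/2 + ... + 1/k: the expected faults per phase when N = k+1."""
    return sum(1.0 / i for i in range(1, k + 1))

for k in (10, 100, 1000):
    print(k, round(harmonic(k), 3), round(math.log(k), 3))  # H_k ~ ln k + 0.577
```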