Stuck on certain DP problems.

Moderator: Board moderators

uha
New poster
Posts: 12
Joined: Mon Mar 22, 2004 3:35 am

Stuck on certain DP problems.

Problems that ask you to partition a set into two mutually disjoint subsets such that the difference between the cumulative weights of the items in each subset is as small as possible. I have an idea of what ought to be done but I can never convince myself of the DP formulation.

Examples are:

10032 - Tug of War

562 - Dividing Coins

I think that the decision at each stage is what subset the i-th item goes into. I would then have to remember the total weight of items in each subset and I would want to minimize the difference between their weights.

That's my idea, but I'm not comfortable with it. I run into problems when I try to write a sensible recurrence based on it. Does anyone have a different way of looking at this type of problem?

Thanks for any help.

gvcormac
Problemsetter & Reviewer
Posts: 194
Joined: Fri Mar 15, 2002 2:00 am
Contact:

Re: Stuck on certain DP problems.

uha wrote:Problems that ask you to partition a set into two mutually disjoint subsets such that the difference between the cumulative weights of the items in each subset is as small as possible. I have an idea of what ought to be done but I can never convince myself of the DP formulation.

Examples are:

10032 - Tug of War

562 - Dividing Coins

I think that the decision at each stage is what subset the i-th item goes into. I would then have to remember the total weight of items in each subset and I would want to minimize the difference between their weights.

That's my idea, but I'm not comfortable with it. I run into problems when I try to write a sensible recurrence based on it. Does anyone have a different way of looking at this type of problem?

Thanks for any help.
The problem you paraphrased is a straightforward knapsack problem. You create a boolean vector indicating all possible cumulative weights, then pick the one closest to the middle. To fill in the table, take each element in turn and add its weight to every cumulative weight in the table, to form a new table. You can do this in place if you consider the weights in the table in decreasing order.

Tug-of-war is a little bit more complicated than that because it also requires that the subsets have equal cardinality. So instead of tabulating just the possible cumulative weights, you must tabulate all combinations of (number of people, cumulative weight). At the end you pick the closest cumulative weight for which the number of people is correct.
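A sketch of that (number of people, cumulative weight) table in C++; the function name and interface are invented for illustration, judge I/O is omitted, and the sample weights in the comment are made up:

```cpp
#include <cstdlib>
#include <vector>
using namespace std;

// Sketch of the tug-of-war DP: possible[k][s] is true iff some k-person
// subset of the given weights sums to exactly s.
int tug_of_war(const vector<int>& w) {
    int n = w.size(), total = 0;
    for (int x : w) total += x;
    vector<vector<bool>> possible(n + 1, vector<bool>(total + 1, false));
    possible[0][0] = true;
    for (int x : w)                          // take each person in turn
        for (int k = n - 1; k >= 0; k--)     // decreasing, so nobody is used twice
            for (int s = total - x; s >= 0; s--)
                if (possible[k][s])
                    possible[k + 1][s + x] = true;
    // Pick the achievable team weight (for a team of n/2 or (n+1)/2 people)
    // closest to half the total.
    int best = 0;
    for (int s = 0; s <= total; s++)
        if ((possible[n / 2][s] || possible[(n + 1) / 2][s]) &&
            abs(2 * s - total) < abs(2 * best - total))
            best = s;
    return best;  // weight of one team; the other team weighs total - best
}
```

With weights {1, 2, 3, 4}, for example, the best split is {1, 4} versus {2, 3}, so this returns 5.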

Larry
Guru
Posts: 647
Joined: Wed Jun 26, 2002 10:12 pm
Location: Hong Kong and New York City
Contact:
Regular DP doesn't work on those problems though; I've gotten TLE using the naive DP method Professor Cormack described. It seems unfortunate that careful pruning is the intended solution. =/
Last edited by Larry on Sun Apr 11, 2004 12:05 am, edited 1 time in total.

gvcormac
Problemsetter & Reviewer
Posts: 194
Joined: Fri Mar 15, 2002 2:00 am
Contact:
Larry wrote:Regular DP doesn't work on those problems though; I've gotten TLE using the naive DP method Professor Cormac described. It seems unfortunate that careful pruning is the intended solution. =/
(Minor point: it is Cormack, not Cormac - my userid is shortened.)

I can only speak of "intent" with respect to Tug of War, because I composed the problem. My intent was that it be a straightforward DP problem. The Waterloo environment gives 30 seconds per test case.

When the problem was adapted for uva, it was rewritten to have multiple input cases per run, with a total time limit of 10 seconds (on about the same speed of machine). As it happens, my solution, modified only to handle the multiple input, runs in 9.5 seconds. If I modify it so as to avoid re-initializing the 500,000 element table, it runs in 8.5 seconds.

So the process of adaptation, inadvertently I think, tightened the time limit so as to exclude solutions that used the correct algorithm but happened to be slower by a small constant factor. This is unfortunate.

gvcormac
Problemsetter & Reviewer
Posts: 194
Joined: Fri Mar 15, 2002 2:00 am
Contact:
Larry wrote:Regular DP doesn't work on those problems though; I've gotten TLE using the naive DP method Professor Cormac described. It seems unfortunate that careful pruning is the intended solution. =/
My implementation of the solution I described to "Dividing Coins" runs in 0.820 seconds.

Larry
Guru
Posts: 647
Joined: Wed Jun 26, 2002 10:12 pm
Location: Hong Kong and New York City
Contact:
You're right; I actually have the code accepted in 0.4 seconds. I mistakenly remembered another problem.

I'll try again on the former problem. I just realized I originally did this in Java, so I'll rewrite it in C and see if it works.

I actually learned this technique from TopCoder. I apologize for misleading anyone!

(I talked to you a few times on TopCoder a long time ago, so I apologize for misremembering your name!)

ranjit
New poster
Posts: 34
Joined: Fri Jan 30, 2004 11:22 am
Location: india

stuck on certain dp problems

Can somebody explain in detail how to solve 562 (Dividing Coins)?
Like uha, I can't convince myself of the DP formulation.

gvcormac
Problemsetter & Reviewer
Posts: 194
Joined: Fri Mar 15, 2002 2:00 am
Contact:

Re: stuck on certain dp problems

ranjit wrote:Can somebody explain in detail how to solve 562 (Dividing Coins)?
Like uha, I can't convince myself of the DP formulation.

int MAX_SUM = 500*100
bool possible_sum[MAX_SUM+1]

possible_sum[0] = true; (all others false)

for each coin
    for s from MAX_SUM - coin downto 0
        if possible_sum[s] then possible_sum[s+coin] = true

find s such that possible_sum[s] is true and s is
closest to 1/2 the total of all coins
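In case a compilable version helps, here is a direct translation of the pseudocode above into C++ (just a sketch: the function name and in-memory interface are mine, the problem's input/output handling is omitted, and the table is sized from the actual total rather than the 500*100 bound):

```cpp
#include <cstdlib>
#include <vector>
using namespace std;

// Smallest achievable difference between the two piles, using the
// boolean table of reachable sums from the pseudocode above.
int min_difference(const vector<int>& coins) {
    int total = 0;
    for (int c : coins) total += c;
    vector<bool> possible(total + 1, false);
    possible[0] = true;
    for (int c : coins)                       // for each coin
        for (int s = total - c; s >= 0; s--)  // downto 0, so each coin is used once
            if (possible[s])
                possible[s + c] = true;
    int best = total;                         // pick s closest to total/2
    for (int s = 0; s <= total; s++)
        if (possible[s] && abs(total - 2 * s) < best)
            best = abs(total - 2 * s);
    return best;
}
```

For coins {1, 2, 4, 6} (total 13), the reachable sum closest to 6.5 is 6 or 7, so the minimum difference is 1.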

ranjit
New poster
Posts: 34
Joined: Fri Jan 30, 2004 11:22 am
Location: india
Thanks a lot, sir.
I got the problem accepted, but I feel you shouldn't have posted the algorithm itself.

Thanks for being so helpful.

Bye.

uha
New poster
Posts: 12
Joined: Mon Mar 22, 2004 3:35 am
Wow!

Thanks a lot for the many helpful replies, and thanks gvcormac for the 'dividing coins' pseudocode. These problems are easily stated but surprisingly difficult to solve in time without a good algorithm. Often it is easy to see where DP might apply, but not so easy to actually follow through.

ranjit
New poster
Posts: 34
Joined: Fri Jan 30, 2004 11:22 am
Location: india

game theory and dynamic programming

Hi,

Problem 10404 also involves building a similar boolean vector.
I wrote a dynamic programming solution but got TLE; it was O(n).
I feel that the problem is more mathematical (maybe number theory).

Hope someone can help.

Bye.

sumankar
A great helper
Posts: 286
Joined: Tue Mar 25, 2003 8:36 am
Location: calcutta
Contact:

Hello Ranjit

Do go through the board; you'll find lots of info regarding 10404.

I don't know if DP is a good choice here, but do try other methods!

Suman

Julien Cornebise
Experienced poster
Posts: 145
Joined: Sat Feb 23, 2002 2:00 am
Location: Paris, France
Contact:

Pb 562, other algorithm, but ...

Good morning everybody

I've solved 562 (AC in 1.4 sec, thanks little joey) with another algorithm, but there is one thing that I don't understand:

Code: Select all

int MAX_SUM = 500*100
bool possible_difference[MAX_SUM+1]

possible_difference[first coin] = 1; (all others 0)

for each coin from second to last {
    new_possible_difference[0...MAX_SUM
Last edited by Julien Cornebise on Wed Jul 21, 2004 12:29 pm, edited 1 time in total.

little joey
Guru
Posts: 1080
Joined: Thu Dec 19, 2002 7:37 pm
511 (Do you know the way to San Jose?) is not (yet) submittable

I guess you mean 562.

Your pseudo code is a little blurry, if you don't mind me saying so. I suspect range errors in expressions like new_possible_difference[d+coin], when d goes from 0 to MAX_SUM+1, while the array has only MAX_SUM+1 elements...

Maybe you could (temporarily) give your actual code.

But besides that, my array possible[1..50000] has exactly 50000 elements (it's in Pascal) and calculates the answer without WA...

Oh yeah, another thing: let TOTSUM be the sum of all coin values; then you only have to consider all possibilities up to TOTSUM div 2, because possible[i] is equal to possible[TOTSUM-i]. That speeds things up compared to always considering 50000 values.
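In code, that symmetry trick might look like this (a sketch assuming positive coin values; the function name is invented):

```cpp
#include <vector>
using namespace std;

// The same reachable-sums DP, but the table stops at TOTSUM div 2:
// since possible[i] == possible[TOTSUM - i], the upper half is redundant.
int min_difference_half(const vector<int>& coins) {
    int total = 0;
    for (int c : coins) total += c;
    int half = total / 2;
    vector<bool> possible(half + 1, false);
    possible[0] = true;
    for (int c : coins)
        for (int s = half - c; s >= 0; s--)
            if (possible[s])
                possible[s + c] = true;
    for (int s = half; s >= 0; s--)  // first reachable sum at or below half
        if (possible[s])
            return total - 2 * s;    // difference between the two piles
    return total;                    // never reached: possible[0] is always true
}
```

This halves both the table size and the inner-loop work compared to always scanning the full 50000 values.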

Julien Cornebise
Experienced poster
Posts: 145
Joined: Sat Feb 23, 2002 2:00 am
Location: Paris, France
Contact:
Hi Little Joey.

I admit my pseudocode is a bit messy. Sorry about that :/
Here's my source code, temporarily (I'll remove it as soon as somebody answers). It might have a few out-of-bounds troubles, but nothing extremely important.

I'm surprised, because I thought one could use MAXCOST+1 instead of MAXCOST*MAX....
[cpp]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX 101
#define MAXCOST 501

int nbpieces;
int pieces[MAX];
int *diff, *newdiff, *temp;

int main() {
    int nbscenar;
    int i, j;

    newdiff = (int*)calloc(MAX*MAXCOST, sizeof(int));
    diff = (int*)calloc(MAX*MAXCOST, sizeof(int));

    for (scanf("%d", &nbscenar); nbscenar > 0; nbscenar--) {
        scanf("%d", &nbpieces);
        for (i = 0; i < nbpieces; i++)
            scanf("%d", &pieces[i]);

        memset(diff, 0, MAX*MAXCOST*sizeof(int));
        memset(newdiff, 0, MAX*MAXCOST*sizeof(int));

        if (nbpieces <= 0) { printf("0\n"); continue; } // case of 0 coins

        diff[pieces[0]] = 1; // with only 1 coin, the only possible difference is that coin's amount
        for (i = 1; i < nbpieces; i++) {

            for (j = 0; j < MAX*MAXCOST; j++)
                // QUESTION: why do I have to go up to MAX*MAXCOST here, and not
                // only up to MAXCOST? Only differences up to MAXCOST seem useful
                // to me, but I get WA if I use the line below instead:
                // for (j = 0; j <= MAXCOST; j++)
                if (diff[j] != 0) {
                    diff[j] = 0; // to avoid having to reset diff each time
                    newdiff[pieces[i] < j ? j - pieces[i] : pieces[i] - j] = 1;
                    newdiff[pieces[i] + j] = 1;
                }
            // new becomes old, ready for the next loop
            temp = newdiff;
            newdiff = diff;
            diff = temp;
        }
        // find the minimum difference that can be achieved
        for (i = 0; diff[i] == 0; i++);
        printf("%d\n", i);
    }
    return 0;
}
[/cpp]