10715 - Cat

Posted: Tue Sep 21, 2004 8:25 pm
by Verdan
Can you tell me what I'm doing wrong?
1. Normalize all weights: Wi = Wi / Sum(Wi, i = 1..n).
2. Sort the weights.
3. Recursively find a solution with total weight in 0.49..0.51.

I've got WA.

Maybe I should print a bigger or smaller solution?
Should I sort the numbers of the rocks in the answer?

Posted: Tue Sep 21, 2004 9:46 pm
by little joey
The total weight of ballast in the port hull should not differ from that in the starboard hull by more than 2%.
This is something else than requiring the difference to be smaller than 2% of the total weight. Looking for an answer between 0.496 and 0.504 would be more accurate.
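For what it's worth, that window checks out, assuming the weights are normalized so the two sides sum to 1 and the ratio condition is 0.98 <= s1/s2 <= 1.02 (my derivation, not from the post):

```latex
\frac{s_1}{1-s_1} \ge 0.98 \;\Rightarrow\; s_1 \ge \frac{0.98}{1.98} \approx 0.49495,
\qquad
\frac{s_1}{1-s_1} \le 1.02 \;\Rightarrow\; s_1 \le \frac{1.02}{2.02} \approx 0.50495
```

so 0.496..0.504 sits safely inside the true interval 0.49495..0.50495.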

Posted: Thu Sep 23, 2004 4:33 pm
by Animosity
ok...where does my logic fail me?

Since there will always be a solution, I'm using the following simple algorithm:

- sort the weights in decreasing order
- keep two sums, s1 = s2 = 0
- s1 += heaviest rock
- s2 += next heaviest
- repeatedly compare s1 to s2 and add the next rock to whichever is smaller
- we end up with two sums that are close to each other

Working with the given example in the problem:
rocks in decreasing order: 90.0,50.0,38.0,10.0,7.1
s1 = 90.0; s2 = 50.0;
s2 += 38.0;
s2 += 10.0;
s1 += 7.1;
and we end up with s1 = 97.1 and s2 = 98.0
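The greedy steps above can be sketched like this (a minimal sketch, not Animosity's actual code; the function name is mine). It reproduces the 97.1 / 98.0 split from the example:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <functional>
#include <utility>
#include <vector>

// Greedy partition as described above: sort descending, then add each
// rock to whichever hull is currently lighter. Returns the two hull sums.
std::pair<double, double> greedySplit(std::vector<double> w) {
    std::sort(w.begin(), w.end(), std::greater<double>());
    double s1 = 0, s2 = 0;
    for (double x : w)
        (s1 <= s2 ? s1 : s2) += x;  // ties go to s1
    return {s1, s2};
}
```

On the problem's example it gives {97.1, 98.0}; on Per's 3 3 2 2 2 case below it gives {7, 5}, missing the perfect {3,3} vs {2,2,2} split.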

Posted: Thu Sep 23, 2004 5:37 pm
by Per
If I understand your logic correctly, it fails when the weights are for instance:

3 3 2 2 2

Posted: Thu Sep 23, 2004 9:12 pm
by Animosity
So true.. too bad I didn't think of it in time.

TLE..

Posted: Mon Nov 08, 2004 12:47 pm
by sohel
I am also using the technique Verdan used, and I amended my code with the modification that little joey added, but I am getting TLE.

I will post my code here, cos it's relatively small..
btw I am with those who are against posting of code, but finding no other alternative, I resorted to it.

[c]
// Name      : Cat
// Number    : 10715
// Author    : Sohel Hafiz
// Date      : 2004-11-08
// Algorithm : Backtracking

#include <stdio.h>
#include <algorithm>

using namespace std;

struct node {
    double d;   // weight
    double r;   // normalized weight
    int index;  // original position
};

node N[105];
int t;
int A[105];
int yes;

void dfs(int u, int d, double sum) {
    if (yes) return;
    if (sum > 0.504) return;
    int i;
    if (sum >= 0.496 && sum <= 0.504) {
        for (i = 0; i < d; i++) {
            if (i) printf(" ");
            printf("%d", A[i] + 1);
        }
        printf("\n");
        yes = 1;
        return;
    }
    if (u == t) return;
    A[d] = N[u].index;

    dfs(u + 1, d + 1, sum + N[u].r);
    dfs(u + 1, d, sum);
}

bool comp(const node &a, const node &b) {
    return a.r > b.r;
}

int main() {
    int i;
    double total;
    while (1) {
        scanf("%d", &t);
        total = 0;
        if (t == 0) break;
        for (i = 0; i < t; i++) {
            scanf("%lf", &N[i].d);
            total += N[i].d;
            N[i].index = i;
        }
        for (i = 0; i < t; i++) {
            N[i].r = N[i].d / total;
        }
        sort(N, N + t, comp);
        yes = 0;
        dfs(0, 0, 0);
    }
    return 0;
}
[/c]

Some help will be appreciated.

Posted: Fri Dec 10, 2004 12:04 am
by Abednego
Sohel, doesn't your algorithm make up to 2^100 calls to dfs(), even with the branch-and-bound pruning that doesn't allow the sum to go over 0.504?

What I did was round every weight to 2 decimal places, multiply them by 100, and run the simple coin-change style subset-sum on it (with dynamic programming). But it gives me WA. My logic is:
1. The total sum of the weights is at most 10000.
2. Half of that is 5000.
3. 2% of that is 100.
4. Divided by the number of stones (at most 100), it is 1, the maximum allowed error per stone.
5. So if I have an error of 0.1 per stone, that should be well within the 2% bound.
6. Multiplying each weight by 100 and rounding gives an error of at most 0.005 per stone.

Hmmm. Did I mess up the coding, or is there something wrong with the logic?

Posted: Fri Dec 10, 2004 12:31 am
by Abednego
Ok. Normalizing first, like Verdan suggested, then scaling by a large number and rounding helped. But I'm still confused why the first method didn't work.
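A sketch of that working approach (normalize, scale by a large constant, round, then a subset-sum DP over achievable scaled sums; the SCALE value and function name are my assumptions, not from the post):

```cpp
#include <cmath>
#include <vector>

// Returns true if some subset of rocks has a normalized weight inside
// little joey's window [0.496, 0.504]. Each weight is divided by the
// total, scaled by SCALE, and rounded; dp[s] marks reachable scaled sums.
bool balancedSubsetExists(const std::vector<double>& w) {
    const int SCALE = 100000;  // assumed scaling factor
    double total = 0;
    for (double x : w) total += x;
    std::vector<int> v;
    int scaledTotal = 0;
    for (double x : w) {
        int s = (int)std::lround(x / total * SCALE);
        v.push_back(s);
        scaledTotal += s;
    }
    // Classic 0/1 subset-sum DP, iterating sums downward so each
    // rock is used at most once.
    std::vector<char> dp(scaledTotal + 1, 0);
    dp[0] = 1;
    for (int s : v)
        for (int t = scaledTotal; t >= s; --t)
            if (dp[t - s]) dp[t] = 1;
    int lo = (int)std::ceil(0.496 * scaledTotal);
    int hi = (int)std::floor(0.504 * scaledTotal);
    for (int t = lo; t <= hi; ++t)
        if (dp[t]) return true;
    return false;
}
```

With n <= 100 rocks this is about 100 * 100000 = 10^7 DP updates, comfortably fast; scaling after normalizing keeps the per-stone rounding error proportional to the total, which is the point Igor makes below.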

Posted: Tue Dec 14, 2004 5:50 am
by IgorOstrovsky
1. The total sum of the weights is at most 10000.
2. Half of that is 5000.
3. 2% of that is 100.
4. Divided by the number of stones (at most 100), it is 1, the maximum allowed error per stone.
5. So if I have an error of 0.1 per stone, that should be well within the 2% bound.
6. Multiplying each weight by 100 and rounding gives an error of at most 0.005 per stone.
Abednego, the problem with your error bound calculation is in your step 1: you assume the maximum sum of weights. If the sum of weights is smaller, all the numbers in your calculation, including the error per stone, will be smaller proportionally.

Igor (the other one :-))

Posted: Tue Dec 14, 2004 6:10 am
by Abednego
Ah! Of course. That makes sense.

Posted: Wed Jun 15, 2005 8:38 pm
by yohney
ok, can somebody help me a bit? Let's say I somehow got weights of, for example, 178 and 180. What does "2% off" mean?

Does it mean it is OK if 1.02 * 178 >= 180, or what? :wink:
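Under little joey's reading earlier in the thread (the lighter side must be at least 98% of the heavier), the check looks like this (a sketch; the function name is mine, and yohney's 1.02 * 178 >= 180 test is nearly the same condition):

```cpp
#include <algorithm>
#include <cassert>

// One reading of "not differ by more than 2%": the lighter hull
// weighs at least 98% of the heavier one.
bool within2Percent(double a, double b) {
    return std::min(a, b) >= 0.98 * std::max(a, b);
}
```

For 178 vs 180: 178 >= 0.98 * 180 = 176.4, so that pair is OK.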

Posted: Sat May 26, 2007 11:21 pm
by LithiumDex
After a little bit of a headache, and a few *cough* submissions, I solved it in 0.031 -- with a GA (that's Genetic Algorithm).

I used to be obsessed with GAs... when all else failed, I would write a GA... let's hope this doesn't cause a relapse, lol :P