## 10715 - Cat

All about problems in Volume 107. If there is a thread about your problem, please use it. If not, create one with its number in the subject.


Verdan
New poster
Posts: 5
Joined: Wed Sep 17, 2003 10:56 am

### 10715 - Cat

Can you tell me what I'm doing wrong?
1. Normalize all weights: Wi = Wi / Sum(W1..Wn).
2. Sort the weights.
3. Recursively search for a subset with total weight in 0.49..0.51.

I've got WA.

Maybe I should print a bigger or smaller solution?
Should I sort the rock numbers in the answer?
little joey
Guru
Posts: 1080
Joined: Thu Dec 19, 2002 7:37 pm
The problem statement says: "The total weight of ballast in the port hull should not differ from that in the starboard hull by more than 2%."
This is not the same as saying the difference should be smaller than 2% of the total weight. Searching for a subset between 0.496 and 0.504 of the total would be more accurate.
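One way to read that sentence (an assumption; which weight the 2% is measured against is exactly what the thread is debating) is as a predicate on the two hull weights. After normalizing so the total is 1, it works out to roughly the band little joey gives:

```cpp
// Hedged reading of the 2% rule: the port ballast may differ from the
// starboard ballast by at most 2% of the starboard ballast. With the
// weights normalized so port + starboard = 1 and port = x, the condition
// |2x - 1| <= 0.02 * (1 - x) solves to roughly 0.4949 <= x <= 0.5050,
// so searching 0.496..0.504 stays safely inside that band.
#include <cmath>

bool balanced(double port, double starboard) {
    return fabs(port - starboard) <= 0.02 * starboard;
}
```

Under this reading, the sample's 97.1 / 98.0 split passes, since 0.9 is well under 2% of 98.0.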
Animosity
New poster
Posts: 2
Joined: Thu Sep 23, 2004 4:25 pm
OK... where does my logic fail me?

Since there will always be a solution, I'm using the following simple algorithm:

- sort the weights in decreasing order
- keep two sums, s1 = s2 = 0
- s1 += heaviest rock
- s2 += next heaviest
- then repeatedly compare s1 to s2 and add the next rock to whichever is smaller
- we end up with two sums that are close to each other

Working with the given example in the problem:
rocks in decreasing order: 90.0,50.0,38.0,10.0,7.1
s1 = 90.0; s2 = 50.0;
s2 += 38.0;
s2 += 10.0;
s1 += 7.1;
and we end up with s1 = 97.1 and s2 = 98.0
Per
A great helper
Posts: 429
Joined: Fri Nov 29, 2002 11:27 pm
Location: Sweden
If I understand your logic correctly, it fails when the weights are, for instance:

3 3 2 2 2

(the greedy ends with 7 against 5, while 3+3 = 2+2+2 = 6 splits them evenly)
Animosity
New poster
Posts: 2
Joined: Thu Sep 23, 2004 4:25 pm
So true... too bad I didn't think of it in time.
sohel
Guru
Posts: 856
Joined: Thu Jan 30, 2003 5:50 am
Location: New York

### TLE..

I am also using the technique Verdan used, amended with the modification that little joey suggested, but I am getting TLE.

I will post my code here, since it's relatively small.
BTW, I am with those who are against posting code, but finding no other alternative, I resorted to it.

[cpp]
// Name      : Cat
// Number    : 10715
// Author    : Sohel Hafiz
// Date      : 2004-11-08
// Algorithm : Backtracking

#include <stdio.h>
#include <algorithm>

using namespace std;

struct node {
    double d;      // raw weight
    double r;      // normalized weight
    int index;     // original position
};

node N[105];
int t;
int A[105];
int yes;

void dfs(int u, int d, double sum) {
    if (yes) return;
    if (sum > 0.504) return;
    int i;
    if (sum >= 0.496 && sum <= 0.504) {
        for (i = 0; i < d; i++) {
            if (i) printf(" ");
            printf("%d", A[i] + 1);
        }
        printf("\n");
        yes = 1;
        return;
    }
    if (u == t) return;
    A[d] = N[u].index;

    dfs(u + 1, d + 1, sum + N[u].r);
    dfs(u + 1, d, sum);
}

bool comp(const node &a, const node &b) {
    return a.r > b.r;
}

int main() {
    int i;
    double total;
    while (1) {
        scanf("%d", &t);
        total = 0;
        if (t == 0) break;
        for (i = 0; i < t; i++) {
            scanf("%lf", &N[i].d);
            total += N[i].d;
            N[i].index = i;
        }
        for (i = 0; i < t; i++) {
            N[i].r = N[i].d / total;
        }
        sort(N, N + t, comp);
        yes = 0;
        dfs(0, 0, 0);
    }
    return 0;
}
[/cpp]

Some help will be appreciated.
Abednego
A great helper
Posts: 281
Joined: Tue Sep 10, 2002 5:14 am
Location: Mountain View, CA, USA
Contact:
Sohel, doesn't your algorithm make up to 2^100 calls to dfs(), even with the branch-and-bound that doesn't allow the sum to go over 0.504?

What I did was round every weight to 2 decimal places, multiply by 100, and run the simple coin-change dispenser on it (with dynamic programming). But it gives me WA. My logic is:
1. The total sum of the weights is at most 10000.
2. Half of that is 5000.
3. 2% of that is 100.
4. Divided by the number of stones (at most 100), that is 1: the maximum allowed error per stone.
5. So if I have an error of 0.1 per stone, that should be well within the 2% bound.
6. Multiplying each weight by 100 and rounding gives an error of at most 0.005 per stone.

Hmmm. Did I mess up the coding, or is there something wrong with the logic?
If only I had as much free time as I did in college...
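The normalize-then-scale variant Abednego describes (his second post below) can be sketched as a subset-sum DP with path recovery. The scale factor and the 0.496..0.504 band are taken from the thread; the exact values the judge tolerates are an assumption:

```cpp
// Sketch of the approach described in the thread: normalize the weights,
// scale to integers, then a 0/1 subset-sum DP that recovers one subset
// whose share of the total lands in the 0.496..0.504 band. SCALE is an
// assumed value, not something the problem statement fixes.
#include <cmath>
#include <vector>
using namespace std;

// Returns 1-based rock numbers for one hull, or {} if no sum hits the band.
vector<int> split(const vector<double>& w) {
    const int SCALE = 100000;
    int n = (int)w.size();
    double total = 0;
    for (double x : w) total += x;
    vector<int> s(n);
    int sum = 0;
    for (int i = 0; i < n; i++) {
        s[i] = (int)llround(w[i] / total * SCALE);  // normalize, scale, round
        sum += s[i];
    }
    vector<char> ok(sum + 1, 0);    // ok[v]: is scaled sum v reachable?
    vector<int> last(sum + 1, -1);  // last rock used when v was first reached
    vector<int> prev(sum + 1, 0);   // reachable sum before adding that rock
    ok[0] = 1;
    for (int i = 0; i < n; i++)
        for (int v = sum; v >= s[i]; v--)   // descending: each rock used once
            if (!ok[v] && ok[v - s[i]]) {
                ok[v] = 1; last[v] = i; prev[v] = v - s[i];
            }
    for (int v = (int)(0.496 * SCALE); v <= (int)(0.504 * SCALE); v++)
        if (ok[v]) {
            vector<int> res;
            for (int u = v; u != 0; u = prev[u]) res.push_back(last[u] + 1);
            return res;
        }
    return {};
}
```

On the sample weights 90 50 38 10 7.1 this finds a subset (e.g. rocks 1 and 5) whose weight fraction falls inside the band.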
Abednego
A great helper
Posts: 281
Joined: Tue Sep 10, 2002 5:14 am
Location: Mountain View, CA, USA
Contact:
Ok. Normalizing first, like Verdan suggested, then scaling by a large number and rounding helped. But I'm still confused why the first method didn't work.
If only I had as much free time as I did in college...
IgorOstrovsky
New poster
Posts: 2
Joined: Tue Dec 14, 2004 5:14 am
Abednego wrote:
1. The total sum of the weights is at most 10000.
2. Half of that is 5000.
3. 2% of that is 100.
4. Divided by the number of stones (at most 100), that is 1: the maximum allowed error per stone.
5. So if I have an error of 0.1 per stone, that should be well within the 2% bound.
6. Multiplying each weight by 100 and rounding gives an error of at most 0.005 per stone.
Abednego, the problem with your error-bound calculation is in step 1: you assume the maximum possible sum of weights. If the actual sum is smaller, every number in the calculation shrinks proportionally, including the allowed error per stone, while your rounding error stays absolute. For example, if the weights sum to 1.0 and there are 100 stones, the allowed error per stone is 0.0001, but rounding to 2 decimal places can be off by up to 0.005 per stone.

Igor (the other one)
Abednego
A great helper
Posts: 281
Joined: Tue Sep 10, 2002 5:14 am
Location: Mountain View, CA, USA
Contact:
Ah! Of course. That makes sense.
If only I had as much free time as I did in college...
yohney
New poster
Posts: 3
Joined: Thu Oct 14, 2004 9:52 am
OK, can somebody help me a bit? Let's say I somehow got weights of, for example, 178 and 180. What does 2% off mean?

Does it mean that if 1.02 * 178 >= 180 then it is OK, or what?
LithiumDex
New poster
Posts: 44
Joined: Tue Jun 06, 2006 6:44 pm
Location: Nova Scotia, Canada
Contact:
After a little bit of a headache, and a few *cough* submissions, I solved it in 0.031 seconds with a GA (that's Genetic Algorithm).

I used to be obsessed with GAs... when all else failed, I would write a GA... let's hope this doesn't cause a relapse, lol.