### Re: 10089 - Repackaging


Posted: **Sun Aug 23, 2009 11:33 pm**

By the way, try problem 802 (Lead or Gold) after this one, it's similar.

Posted: **Tue Aug 25, 2009 1:49 am**

mf wrote: Well, isn't it obvious? A linear combination of the vectors (s2 - s1, s3 - s1) is zero if and only if the same linear combination of the vectors (s1, s2, s3) is a vector with three equal components, namely, all equal to this linear combination of the s1's. It's just simple algebra and manipulation of sums. It doesn't deserve to be called a theorem, or even a lemma. The proper term for this is "left as an exercise to the reader".

Now, to answer why it's useful. To check that you can make a zero vector, you basically need to check that your collection of vectors spans an angle of 180 degrees or more, i.e. there doesn't exist a hyperplane w*x=0 through the origin ("*" is the dot product of vectors here), such that all the given vectors lie strictly to one side of it.

Necessity: if there exists a vector w such that w*x_i > 0 for all i, then for any non-negative coefficients c_i (not all zero), the vector s = \sum c_i x_i satisfies: w*s = w * \sum (c_i x_i) = \sum (c_i w*x_i) = \sum (c_i * <something strictly positive>) > 0, and therefore s can't be the zero vector.

Sufficiency: if the hypothesis is satisfied, then you can always find a linear combination of the given vectors with rational non-negative coefficients (not all zeroes) which will sum to the zero vector. I've tried writing a proof of this fact, but it was getting too long for a single post... You'd better try to draw some pictures and get a feel for it yourself, instead.

And once you've got a combination with rational coefficients, you can always turn it into integers by multiplying all coefficients by the LCM of their denominators.
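A minimal sketch of that last step in Python, with made-up coefficients (`Fraction` and `math.lcm` are from the standard library):

```python
from fractions import Fraction
from math import lcm

# Hypothetical rational coefficients of a zero combination.
coeffs = [Fraction(1, 2), Fraction(2, 3), Fraction(5, 6)]

# Multiply by the LCM of the denominators to make them all integers.
m = lcm(*(c.denominator for c in coeffs))
int_coeffs = [int(c * m) for c in coeffs]

print(m, int_coeffs)  # 6 [3, 4, 5]
```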

Let the 3 packages be (p11, p12, p13), (p21, p22, p23) and (p31, p32, p33). Now we have to find X, Y, Z such that

X*p11 + Y*p21 + Z*p31 = k

X*p12 + Y*p22 + Z*p32 = k

X*p13 + Y*p23 + Z*p33 = k

and X+Y+Z>0 and k > 0

P =

[ p11 p21 p31 ]

[ p12 p22 p32 ]

[ p13 p23 p33 ]

Q = (X, Y, Z)^T

P * Q = k * (1, 1, 1)^T
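As a purely illustrative brute force (bounded search over hypothetical small X, Y, Z; not the intended solution, which needs the linear-algebra argument above), the system can be checked like this:

```python
from itertools import product

def can_repackage(p1, p2, p3, limit=10):
    # Brute force over a small hypothetical range of X, Y, Z: look for
    # X*p1 + Y*p2 + Z*p3 = (k, k, k) with X + Y + Z > 0 and k > 0.
    for X, Y, Z in product(range(limit + 1), repeat=3):
        if X + Y + Z == 0:
            continue
        s = [X * a + Y * b + Z * c for a, b, c in zip(p1, p2, p3)]
        if s[0] == s[1] == s[2] and s[0] > 0:
            return X, Y, Z, s[0]  # coefficients and the common k
    return None

# Made-up packages: three "unit" packages trivially balance.
print(can_repackage((1, 0, 0), (0, 1, 0), (0, 0, 1)))  # (1, 1, 1, 1)
# Proportional packages can never balance (k would have to be 0).
print(can_repackage((1, 2, 3), (2, 4, 6), (3, 6, 9)))  # None
```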

mf wrote:

A linear combination of the vectors (s2[i] - s1[i], s3[i] - s1[i]) is zero if and only if the same linear combination of the vectors (s1[i], s2[i], s3[i]) is a vector with three equal components, namely, all equal to this linear combination of the s1[i]'s. It's just simple algebra and manipulation of sums. It doesn't deserve to be called a theorem, or even a lemma.

I don't understand this. Can you please explain how you reduced the problem to this and formulated it?

Please tell me what I should learn and read up on.

Posted: **Tue Aug 25, 2009 2:29 am**

tryit1 wrote:

mf wrote: A linear combination of the vectors (s2 - s1, s3 - s1) is zero if and only if the same linear combination of the vectors (s1, s2, s3) is a vector with three equal components, namely, all equal to this linear combination of the s1's. It's just simple algebra and manipulation of sums. It doesn't deserve to be called a theorem, or even a lemma.

I don't understand this. Can you please explain how you reduced the problem to this and formulated it?

Let c[1], ..., c[n] be any real numbers (or integers, or rationals - it doesn't really matter).

Define these sums:

S1 = c[1]*s1[1] + ... + c[n]*s1[n], and similarly S2 and S3 with the s2's and s3's;

S21 = c[1]*(s2[1] - s1[1]) + ... + c[n]*(s2[n] - s1[n]), and similarly S31 with the differences s3[i] - s1[i].

We want to show that S21 = S31 = 0 if and only if S1 = S2 = S3.

Proof:

First, observe that S21 = S2 - S1, by the distributive law of multiplication and elementary properties of sums: each term c[i]*(s2[i] - s1[i]) expands to c[i]*s2[i] - c[i]*s1[i], and summing over i gives S2 - S1.

Analogously, S31 = S3 - S1.

S1 = S2 = S3, iff

0 = S2 - S1 = S3 - S1 (I have subtracted S1 from all sides of the equality), iff

S2 - S1 = 0 and S3 - S1 = 0, iff

S21 = 0 and S31 = 0. QED.
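A quick numeric sanity check of the identity, with arbitrary made-up numbers:

```python
# Arbitrary made-up data: n = 3 terms.
s1 = [3, 1, 4]
s2 = [1, 5, 9]
s3 = [2, 6, 5]
c  = [2, -1, 3]

S1  = sum(ci * v for ci, v in zip(c, s1))               # 17
S2  = sum(ci * v for ci, v in zip(c, s2))               # 24
S3  = sum(ci * v for ci, v in zip(c, s3))               # 13
S21 = sum(ci * (a - b) for ci, a, b in zip(c, s2, s1))  # 7
S31 = sum(ci * (a - b) for ci, a, b in zip(c, s3, s1))  # -4

assert S21 == S2 - S1
assert S31 == S3 - S1
```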

Please tell me what I should learn and read up on.

Concrete Mathematics is the right book for learning all this stuff.

Posted: **Tue Aug 25, 2009 9:17 am**

I knew that one. Didn't see it. It is just subtraction of 2 vectors and then making the independent components equal to 0.

For that you take 2 points on one side of the x-axis and the third point on the other side of the x-axis. So if we have a point (x,y), to reduce the x-component we need an angle of 90° between them, say with some point (x',y), and to reduce the y-component of (x,y) you need an angle of 90° with some point (x,y'). So a total of 180°. If it is more, it can be scaled, right? I'm a little confused.

What if we have points like (1,2), (3,4) and (5,-6)? Their angle is greater than 180°, but I don't think we can form the linear combination, because the x-components will increase.

Does this mean, given 2D points on a plane, you have to find a subset of them such that they form a closed polygon?

mf wrote: Now, to answer why it's useful. To check that you can make a zero vector, you basically need to check that your collection of vectors spans an angle of 180 degrees or more, i.e. there doesn't exist a hyperplane w*x = 0 through the origin ("*" is the dot product of vectors here), such that all the given vectors lie strictly to one side of it.


Posted: **Tue Aug 25, 2009 12:29 pm**

No, that's not what I mean. Sorry for confusing you with complicated words like "hyperplane", I think I over-generalized the things here (to n dimensions)...

What I mean is: if there exists a line through the origin, that is, a line of the form ax + by = 0, such that all given vectors lie strictly to one of its sides (a*x_i + b*y_i > 0 for each vector (x_i, y_i)), then the answer is "No", you can't get a zero vector.

In your example, such a line would be the OY axis, because all your points lie to the right of it (their x coordinate is > 0).
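A minimal check of that claim on the three example points, with w = (a, b) = (1, 0) as the normal of the OY axis:

```python
# mf's example: all three vectors lie strictly to the right of the OY
# axis, so the line 1*x + 0*y = 0 separates them from the origin.
vectors = [(1, 2), (3, 4), (5, -6)]
a, b = 1, 0  # coefficients of the candidate line a*x + b*y = 0

assert all(a * x + b * y > 0 for x, y in vectors)
print("all vectors strictly on one side -> answer is No")
```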


Posted: **Tue Aug 25, 2009 1:01 pm**

I've got an O(n) check after sorting: if the angle between two consecutive vectors x{n} and x{n+1} is > 180°, then we report false, because we can rotate the axes to x{n+1} and everything fits within < 180°. Good stuff. Thanks mf for teaching me this.

Generalising this to n dimensions would be good. All I understand is: reduce it to n-1 dimensions, and see if everything can be put in a half-space bounded by an (n-2)-dimensional hyperplane through the origin. If yes, bad.

For a 4-D package, we have 3-D points and 2-D planes through the origin. Check for angles in the xy plane, yz plane and xz plane. Is that correct? Can it be done any faster?


Posted: **Tue Aug 25, 2009 1:32 pm**

No, I don't know how to extend the angle check to 3D and beyond.

But convex hull check generalizes very nicely. Looks like our hyperplane exists if the origin is outside of the convex hull of given vectors. And there's a simple fact about convex hulls: in 2 dimensions, convex hull of a set of points is the union of all possible triangles with vertices at given points (I've learned this from problem 361, btw!); in 3 dimensions - union of all possible tetrahedrons; and in K dimensions - union of all possible simplices with vertices at given points.

So this gives an O(N^(K+1) K^3) algorithm: for each possible subset of points of size K+1, check if the origin is inside a K-dimensional simplex with corners at these K+1 points. If, for some subset, it is, then yes, we can construct a zero vector. You can check whether a point is inside a K-dimensional simplex using determinants in O(K^3), I think.
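In 2 dimensions the per-simplex test is just an origin-in-triangle check with signed areas (a sketch for K = 2, where each "determinant" is a 2x2 cross product):

```python
def side(o, a, b):
    # Sign of the 2x2 determinant |a-o, b-o|: which side of line o->a
    # the point b lies on.
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def origin_in_triangle(p, q, r):
    # The origin lies inside the triangle (or on its boundary) iff it is
    # on the same side of all three directed edges.
    o = (0, 0)
    d1, d2, d3 = side(p, q, o), side(q, r, o), side(r, p, o)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

print(origin_in_triangle((1, 0), (-1, 1), (-1, -1)))  # True
print(origin_in_triangle((1, 2), (3, 4), (5, -6)))    # False
```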


Posted: **Mon Nov 26, 2012 5:28 am**

I still can't figure out why three vectors with included angles less than PI can be linearly combined to reach (0,0) using rational coefficients. Can anyone give me some more hints?

Thanks!
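One way to see it for three 2D vectors (a sketch, not from the thread): by Cramer's rule the identity cross(v2,v3)*v1 + cross(v3,v1)*v2 + cross(v1,v2)*v3 = (0,0) holds for any three vectors, and when no open half-plane contains all three, these cross products turn out to share one sign, so after flipping signs if needed they are non-negative integer (hence rational) coefficients:

```python
def cross(u, v):
    return u[0] * v[1] - u[1] * v[0]

def zero_combination(v1, v2, v3):
    # c1*v1 + c2*v2 + c3*v3 = (0, 0) holds identically for these c's;
    # when the vectors surround the origin, c1, c2, c3 share one sign.
    return cross(v2, v3), cross(v3, v1), cross(v1, v2)

v1, v2, v3 = (1, 0), (-1, 1), (-1, -1)
c1, c2, c3 = zero_combination(v1, v2, v3)
print(c1, c2, c3)  # 2 1 1
assert c1 * v1[0] + c2 * v2[0] + c3 * v3[0] == 0
assert c1 * v1[1] + c2 * v2[1] + c3 * v3[1] == 0
```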


Posted: **Sat Dec 07, 2013 11:57 pm**

Can someone tell me why the output for the following input is `Yes`? O_o


```
3
1 2 3
2 4 6
3 6 9
0
```


`Yes`

Posted: **Tue Dec 10, 2013 12:32 am**

AC output for that input is No

Posted: **Tue Dec 10, 2013 9:23 pm**

It's obvious, but I'm interested in why UVa Toolkit said "Yes". Is it because of a bad algorithm? [don't need the answer]

What's more - this error occurs for every input like:


```
x
a b c
2a 2b 2c
...
na nb nc
0
```

Posted: **Tue Dec 10, 2013 9:43 pm**

UVa Toolkit is wrong.