
### 10606 - Opening Doors

Posted: **Fri Jan 23, 2004 6:44 pm**

by **abishek**

I used two different algorithms to get the answer.

1. compute sqrt(n), round it down, and multiply the result by itself

2. take two numbers a and b such that a * a < n < b * b, and do a binary search in [a, b] for an integer m such that m * m <= n and (m + 1) * (m + 1) > n.
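For reference, the binary-search idea in (2) can be sketched with Java's BigInteger (Java is an assumption here; the post doesn't say which language was used):

```java
import java.math.BigInteger;

public class IsqrtBinarySearch {
    // Binary search for the largest m with m * m <= n (n >= 0).
    static BigInteger isqrt(BigInteger n) {
        BigInteger lo = BigInteger.ZERO;
        // hi = 2^ceil(bitLength/2) guarantees hi * hi > n.
        BigInteger hi = BigInteger.ONE.shiftLeft((n.bitLength() + 1) / 2);
        while (lo.compareTo(hi) < 0) {
            // Round the midpoint up so the loop always makes progress.
            BigInteger mid = lo.add(hi).add(BigInteger.ONE).shiftRight(1);
            if (mid.multiply(mid).compareTo(n) <= 0)
                lo = mid;                          // mid could be the answer
            else
                hi = mid.subtract(BigInteger.ONE); // mid is too big
        }
        return lo;
    }
}
```

Squaring the returned value gives the required closest square; the cost is dominated by the big multiplication done in each iteration of the search.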

I still get TLE.

Is there a better method?

abi

Posted: **Fri Jan 23, 2004 9:21 pm**

by **Per**

I don't think there is a better method than squaring the floor of the square root (though I would love to see someone prove me wrong on this).

The only problem is that your sqrt routine has to be fast enough. The optimisation that did the trick for me, giving a vast speed increase, was to begin by approximating the square root with a double.

A *very* rough estimate is that you need to be able to solve around 100-150 maximum-size test cases per second (on the judge's computer).
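Per doesn't show his routine, but one plausible reading of the double trick, sketched in Java with BigInteger (class and method names are mine), is to seed an integer Newton iteration with the floating-point estimate so that only a few bignum divisions remain:

```java
import java.math.BigInteger;

public class SeededIsqrt {
    // Floor square root: a double supplies ~50 correct leading bits,
    // and each integer Newton step roughly doubles the accuracy, so
    // only a handful of bignum divisions are needed even for
    // 100-digit input.
    static BigInteger isqrt(BigInteger n) {
        if (n.signum() == 0) return BigInteger.ZERO;
        // Shift n down by an even power of two so it fits a double exactly.
        int shift = Math.max(0, (n.bitLength() - 52 + 1) / 2 * 2);
        double approx = Math.sqrt(n.shiftRight(shift).doubleValue());
        // "+ 2" keeps the seed safely at or above sqrt(n), which the
        // descending Newton loop below relies on.
        BigInteger x = BigInteger.valueOf((long) approx + 2).shiftLeft(shift / 2);
        while (true) {
            BigInteger y = x.add(n.divide(x)).shiftRight(1);
            if (y.compareTo(x) >= 0) return x; // stopped decreasing: done
            x = y;
        }
    }
}
```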

Posted: **Sun Jan 25, 2004 7:35 pm**

by **alex[LSD]**

Well, there is a different method. I'm not saying it's better, but I got AC using it.

If the square is X digits long then the root is at most (X+1)/2 digits long. What I did was: I went from left to right, finding the digits of the root one position at a time.

Consider B is the next Digit you are trying to find, and A is the current approximation of the root.

(A+B)^2 = A^2 + 2AB + B^2;

In this formula, A^2 is already known from previous iterations, 2AB is a "short by long" multiplication (followed by appending zeros on the right), and B^2 is the square of a single digit.

To speed things up (I got TLE before getting AC in 4.2 s):

1. Use binary search to find the digit B.

2. Don't run through the whole number in the multiplications when you don't need to.

3. Use a base-10000 number system.

Hope this helps somebody...
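As a simplified sketch of the digit-at-a-time idea (working in plain base 10 with a descending scan over the candidate digit, instead of alex's base-10000 representation and binary search on the digit):

```java
import java.math.BigInteger;

public class LonghandSqrt {
    // Digit-by-digit square root: extend the root a one decimal digit
    // at a time, maintaining its square incrementally via
    // (a + b*10^pos)^2 = a^2 + 2*a*b*10^pos + b^2*10^(2*pos).
    static BigInteger isqrt(BigInteger n) {
        int rootDigits = (n.toString().length() + 1) / 2;
        BigInteger a = BigInteger.ZERO;   // root found so far
        BigInteger aa = BigInteger.ZERO;  // its square
        for (int pos = rootDigits - 1; pos >= 0; pos--) {
            BigInteger p = BigInteger.TEN.pow(pos);
            // Largest digit b in 0..9 keeping the square <= n
            // (b = 0 always works, so the scan is safe).
            for (int b = 9; b >= 0; b--) {
                BigInteger bp = BigInteger.valueOf(b).multiply(p);
                BigInteger tt = aa.add(a.multiply(bp).shiftLeft(1))
                                  .add(bp.multiply(bp));
                if (tt.compareTo(n) <= 0) {
                    a = a.add(bp);
                    aa = tt;
                    break;
                }
            }
        }
        return a;
    }
}
```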

Posted: **Mon Jan 26, 2004 3:38 pm**

by **little joey**

Per wrote: I don't think there is a better method than squaring the floor of the square root (though I would love to see someone prove me wrong on this).

I'm not sure what you mean by this, Per. But my guess is that you use a binary search, where you determine the initial boundaries by converting to doubles.

If so, this method is basically O(n^3*ln(n)) (where n is the number of digits in the answer): O(n*ln(n)) iterations in the binary search, doing one multiplication, which is O(n^2), per iteration to determine a square.

My method, like the method alex[LSD] describes, is only O(n^3): Producing a constant number of digits each iteration, it requires O(n) iterations, each also with one multiplication, which is O(n^2).

For such small values of n (I use nine decimal digits per int, so for a 50-digit square root n is only 6), it takes a darn good algorithm and some darn good bigint functions to overtake the simplicity of a binary search. Spending the best part of last Sunday, I was able to bring the runtime down to 0.3 sec, one third of which is spent on I/O.

I am not going to publish my method, because I think it's a spoiler. But if you got AC on this problem and send me a PM, I

*might* send you my source code (C).

-little joey

Posted: **Mon Jan 26, 2004 7:03 pm**

by **Per**

What I meant was: I don't think this problem is easier than taking square roots (i.e. I don't think there is a solution which is more efficient than taking the square root and then squaring it).

I'm not sure I understand yours or Alex's method completely, but it sounds like you solve the problem by finding the square root, so I don't think I am proven wrong yet.

Actually, I didn't solve the problem by binary search, but by Newton's method applied to finding square roots. I think the complexity should be better than O(n^3) (I will look it up when I'm not as tired as I am at the moment), though of course, it gets pretty bad constant factors as it makes use of division.

Posted: **Mon Jan 26, 2004 8:55 pm**

by **Per**

Well, one invigorating coffee break and some calculations later I feel pretty certain that the Newton-Raphson solution needs no more than O(log n) iterations, thus making the time complexity O(n^2 log n) assuming an O(n^2) division algorithm. But as you said, using a base 10^9-representation, n is quite small in this problem, so constant factors become important.

Just out of sheer curiosity, I wrote a 15-line solution using GMP, and it was ~12.5 times faster than my solution compiled with optimisation (which in turn is ~5 times faster than compiled without optimisation). The square root algorithm[1] used by GMP apparently has the same complexity as Karatsuba's algorithm (and with nice constants).

[1]: Paul Zimmermann, "Karatsuba Square Root", INRIA Research Report 3805, November 1999, http://www.inria.fr/RRRT/RR-3805.html

Posted: **Mon Jan 26, 2004 11:52 pm**

by **little joey**

Now I see what you mean. I can't imagine a better method either.

Although I'm neither a mathematician nor a computer scientist and understand only a small part of it, I found the article you referred to quite interesting. I can hardly understand big-oh notation for complexity analysis, let alone big-em or big-kay.

By re-looking, re-thinking and re-calculating I found out that my method _is_ basically Newton-Raphson, although I never recognised it as such... I calculate a correction to an estimate of sqrt(x) using doubles instead of bignums, to avoid dividing bignums. This technically limits me to about 10 decimals per iteration, making it O(n^3), although it could in principle be O(n^2*ln(n)) if I did bignum division.

Posted: **Wed Jan 28, 2004 5:40 pm**

by **little joey**

OK, this is really the last thing I'm going to say about this...

Consider this function in pseudocode:

[c]type_x closest_square(type_x A){
    find i such that 2^(2*i) <= A < 2^(2*i+2)
    if (2^(2*i) == A) return A;
    x = 2^i;
    xx = 2^(2*i);
    while (--i >= 0){   /* note: --i, not i--, or the last pass would use i = -1 */
        a = 2^i;
        aa = 2^(2*i);
        t = x + a;
        tt = xx + 2*x*a + aa;
        if (tt == A) return tt;
        if (tt < A){
            x = t;
            xx = tt;
        }
    }
    return xx;
}[/c]

It finds the required square without squaring the square root, in the sense that Per meant (although it finally _knows_ the square root, but only as an intermediate value).

It is O(n^2) if you make type_x a binary type (a binary number with 300+ bits), because then you only need operators that are linear in the number of bits (assignment, comparison, addition and shift); 2^i can be rewritten as 1<<i, and 2*x*a can be rewritten as x<<(i+1).

My current implementation of this algorithm gives AC in 0.85 sec, much slower than my previous program. This is mainly due to the 300+ iterations it requires for a 100-digit decimal number. But my suspicion is that it could be made an order of magnitude faster if the operators were worked out in assembly (or better C, for that matter).
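For the curious, a direct BigInteger transcription of the routine above (class and method names are mine; BigInteger stands in for the suggested 300+ bit binary type, and every product is a shift, as described):

```java
import java.math.BigInteger;

public class ClosestSquare {
    // Largest perfect square <= a, built bit by bit without ever
    // multiplying two big numbers together.
    static BigInteger closestSquare(BigInteger a) {
        if (a.signum() == 0) return BigInteger.ZERO;
        int i = (a.bitLength() - 1) / 2;                 // 2^(2i) <= a < 2^(2i+2)
        BigInteger x = BigInteger.ONE.shiftLeft(i);      // root so far
        BigInteger xx = BigInteger.ONE.shiftLeft(2 * i); // its square
        while (--i >= 0) {
            // Candidate (x + 2^i)^2 = x^2 + x*2^(i+1) + 2^(2i): shifts only.
            BigInteger tt = xx.add(x.shiftLeft(i + 1))
                              .add(BigInteger.ONE.shiftLeft(2 * i));
            if (tt.compareTo(a) <= 0) {
                x = x.setBit(i); // accept bit i of the root
                xx = tt;
            }
        }
        return xx;
    }
}
```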

Posted: **Sun Feb 01, 2004 1:20 pm**

by **alex[LSD]**


Posted: **Tue Apr 09, 2013 4:09 pm**

by **shipu_a**

TLE, please help.


```
import java.math.BigInteger;
import java.util.Scanner;

class SquareRoot
{
    // Integer Newton iteration for floor(sqrt(in)).
    // Terminating on x == y can loop forever (for in = 3 the iterates
    // oscillate 1, 2, 1, 2, ...); stop instead as soon as the iterate
    // no longer decreases.
    BigInteger bigsqrt(BigInteger in)
    {
        // Seed with 2^ceil(bitLength/2): always >= sqrt(in), and far
        // closer to it than the original seed y = in, so far fewer
        // expensive BigInteger divisions are needed.
        BigInteger x = BigInteger.ONE.shiftLeft((in.bitLength() + 1) / 2);
        while (true)
        {
            BigInteger y = x.add(in.divide(x)).shiftRight(1);
            if (y.compareTo(x) >= 0)
                return x;
            x = y;
        }
    }
}

public class Main
{
    public static void main(String[] args)
    {
        Scanner in = new Scanner(System.in);
        SquareRoot inp = new SquareRoot();
        while (in.hasNext())
        {
            BigInteger n = in.nextBigInteger();
            if (n.signum() == 0)
                break;
            BigInteger num = inp.bigsqrt(n);
            System.out.println(num.multiply(num));
        }
    }
}
```

### Re: 10606 - Opening Doors

Posted: **Tue Apr 09, 2013 11:31 pm**

by **brianfry713**

Try using binary search to find the sqrt.