What's not entirely clear to me is why the AC output for input "1" is "1". Shouldn't it be "2"?

**Update:**

**juniorAw** answered this question on uHunt, and I'd like to share what I learned. Recall from the problem statement that

Code: Select all

`x(i+1) = the number of digits in the decimal representation of x(i)`

Assume x0 = 1. Now,

Code: Select all

`x1 = x(0+1) = the number of digits in the decimal representation of x0 = 1. So, x0 = x1.`

Assume x0 = 2. Now,

Code: Select all

```
x1 = x(0+1) = the number of digits in the decimal representation of x0 = 1. So, x0 != x1.
x2 = x(1+1) = the number of digits in the decimal representation of x1 = 1. So, x1 = x2.
```

So the sequence first repeats at i = 1 when x0 = 1, but only at i = 2 when x0 = 2. That's why the AC output for input 1 is 1, not 2.
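In other words, the answer appears to be the smallest i >= 1 such that x(i-1) = x(i). A minimal Python sketch of that rule (the function name is my own):

```python
def steps_to_fixed_point(x0: str) -> int:
    # x0 is taken as a string because the inputs can be hundreds of digits long.
    # Iterate x(i+1) = number of digits of x(i) and return the first index
    # i >= 1 at which x(i) equals x(i-1).
    prev = x0
    i = 0
    while True:
        i += 1
        cur = str(len(prev))  # digit count of the previous term
        if cur == prev:
            return i
        prev = cur
```

For example, `steps_to_fixed_point("1")` returns 1 while `steps_to_fixed_point("2")` returns 2, matching the accepted output.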

Input:

Code: Select all

```
1
2
3
4
5
6
7
8
9
43
674
8394394390224254424242342324
888888888888888888888888888888232323289238923892389238923892389238923892
999999999999999999999999999999999999999999999992392392392392392923923923923923923923923923923923923923923333333333333333333333
9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999943535353535935999999999999999999999999999999999999999999993454353535354353593999
END
```

Corresponding AC output:

Code: Select all

```
1
2
2
2
2
2
2
2
2
3
3
4
4
4
4
```
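For completeness, here is a self-contained driver that reads cases until the END sentinel and reproduces the output above. This is a sketch under the same assumption (the answer is the smallest i >= 1 with x(i-1) = x(i)); the function names are my own:

```python
import sys

def solve(x0: str) -> int:
    # Return the smallest i >= 1 with x(i) == x(i-1), where
    # x(i+1) is the digit count of x(i).
    prev, i = x0, 0
    while True:
        i += 1
        cur = str(len(prev))
        if cur == prev:
            return i
        prev = cur

def main() -> None:
    for line in sys.stdin:
        token = line.strip()
        if not token or token == "END":
            break
        print(solve(token))

if __name__ == "__main__":
    main()
```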