When I use C or C++, I use the printf function for rounding:
Code:
double d = 1.00045;
printf("%.3lf", d);  /* prints "1.000": the value is rounded to 3 decimal places */
Note that printf does NOT always round correctly in the critical cases (although there are lots of problems where mathematically correct rounding gives WA and printf-style rounding gives AC). Also, adding a very small number like 0.00000001 before rounding with printf sometimes turns WA into AC. But to me it's unclear when this should or shouldn't be done...
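To make the epsilon trick concrete, here is a minimal sketch (the value 0.285 and the epsilon 1e-9 are just illustrative choices, not from any particular problem):

Code:
#include <stdio.h>

int main(void)
{
    /* 0.285 is stored as roughly 0.28499999999999998, so printf
       rounds it DOWN, which is probably not what the judge expects. */
    double d = 0.285;
    printf("%.2f\n", d);         /* prints 0.28 on a typical system */

    /* Nudging the value up by a tiny epsilon before printing
       pushes it past the halfway point. */
    printf("%.2f\n", d + 1e-9);  /* prints 0.29 */
    return 0;
}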
When I use Java, I have my own rounding function. It rounds correctly, but sometimes I still get WA because the judge used printf for rounding and the results differ in the critical cases (see the sketch below for one such case). So generally, whenever rounding is involved, I avoid Java because of this...
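Here is one critical case where printf and a hand-rolled rounding function genuinely disagree, assuming a judge whose printf rounds ties to even (glibc does). The roundHalfUp3 helper is just a hypothetical stand-in for such a rounding function, not the exact one I use:

Code:
#include <math.h>
#include <stdio.h>

/* Hypothetical round-half-up helper: rounds d to 3 decimal places,
   rounding ties upward (for positive d). */
double roundHalfUp3(double d)
{
    return floor(d * 1000.0 + 0.5) / 1000.0;
}

int main(void)
{
    /* 0.0625 is exactly representable in binary, so the tie is "real". */
    double d = 0.0625;
    printf("%.3f\n", d);                /* typically prints 0.062 (ties rounded to even) */
    printf("%.3f\n", roundHalfUp3(d));  /* prints 0.063 (tie rounded up) */
    return 0;
}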
Does this help a bit?