From: Rick.Regan on 26 Jun 2010 05:26

I got the same results in C using printf with "%0.10f", which prints 16 significant digits: -937566.2364699869 in Visual C++ and -937566.2364699868 in gcc on Linux. I checked to make sure both were converting to the same double value, since that could explain the difference. (If I may plug my own articles here, for related reading: http://www.exploringbinary.com/incorrectly-rounded-conversions-in-visual-c-plus-plus/ and http://www.exploringbinary.com/incorrectly-rounded-conversions-in-gcc-and-glibc/ .) But that wasn't the problem: both convert to -0x1.c9cbc79129818p+19, which in decimal is -937566.236469986848533153533935546875 (this is the nearest representable value, as I verified by hand).

It seems that gcc (well, actually glibc, since that's where strtod() lives) is right: the digits after the 10th decimal place are 48533153533935546875, which is less than 1/2 ULP, so the 10th decimal digit rounds down to '8', not up to '9'.

From: Andrew on 25-Jun-10

Yes. I looked up the values in the original input file. They tally with the output file. What issues are these? Can you give a reference please?

Actually, I think the way to ensure that the numbers have their exact string representation preserved is to keep them as strings when the file is read in. This was not done for memory space reasons: holding each number as a double rather than a string takes only 8 bytes, but a string takes around twice that. The files being processed are hundreds of megabytes, and the program is close to blowing up due to lack of memory.

I understand, and normally I trust GCC and do not trust VS, but this time VS has the correct behaviour. I diff'd the output built with VS against the output built with GCC and selected a few lines that were different. I looked up the VS strings in the original input file and found them; I did not find the different strings produced by the GCC run. I can prove there is a problem.
Here is a little program:

#include <algorithm>
#include <cmath>
#include <iomanip>
#include <iostream>
#include <sstream>
#include <string>

double convertStringToValue(const std::string &input)
{
    double value;
    std::stringstream str(input);
    str >> value;
    return value;
}

std::string formatValue(double value)
{
    std::stringstream str;
    int prec = std::min<int>(15 - (int)log10(fabs(value)), 15);
    str << std::fixed << std::setprecision(prec) << value;
    return str.str();
}

int main()
{
    std::string input = "-937566.2364699869";
    double value = convertStringToValue(input);
    std::string converted = formatValue(value);
    if (input == converted)
        std::cout << input << " converted ok." << std::endl;
    else
    {
        std::cout << "Conversion failed:" << std::endl
                  << "Input:     " << input << std::endl
                  << "Converted: " << converted << std::endl;
    }
    return 0;
}

Here is what I get when I run it (built using GCC 4.4.2):

Conversion failed:
Input:     -937566.2364699869
Converted: -937566.2364699868

--
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]