From: Martin Gregorie on 9 Jul 2010 19:06

On Fri, 09 Jul 2010 16:05:39 -0600, Jim Janney wrote:
>
> In practice there's a strong tendency toward whatever the programming
> language makes convenient. The system I work with was originally
> written in RPG. RPG strongly favors fixed point decimal, and that's
> what is mostly used in the RPG code and in the database, for example
> 9 digits with 2 decimal places for dollar and cent amounts and 5 digits
> with 5 decimal places for interest rates. In Java we have USD and
> Percent classes that use BigDecimal internally.
>
Actually, that owes as much to the prehistoric IBM hardware that RPG was
initially designed for - most small S/360s, e.g. 360/30 and smaller
(S/3, S/36, etc.), had no floating point available and many of them
could only do BCD arithmetic. The really small ones used a 4-bit serial
adder. Performance was terrible, but there was no hardware limit to the
number of digits that composed a number.

Before that, very few, if any, general-purpose 2nd-generation mainframes
had floating-point hardware - that was restricted to 'scientific'
computers such as the Elliott 503.

> Using integers for currency amounts is numerically sound but relies on
> the programmer to keep track of the decimal position. Not bad for small
> projects but it would make me nervous for larger projects where there's
> significant turnover among the programmers.
>
That is a good reason for holding currency amounts in binary pence/cents
rather than fixed decimal dollars/pounds - the decimal point only needs
tracking when amounts are converted to or from strings, which makes it
largely a non-issue, even in assembler.

COBOL solved the problem another way - even when I first used it in 1969
the language had a COPY verb to pull in record definitions, etc. from a
source code library. There's no real equivalent in other languages, and
certainly not in Java - COPY falls somewhere between C's use of #define
in header files and a decent macrogenerator.

--
martin@   | Martin Gregorie
gregorie. | Essex, UK
org       |
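To make the two approaches mentioned above concrete, here is a minimal
Java sketch (not from the original thread): integer cents where the
scale is implicit and only matters at string conversion, and BigDecimal
with an explicit scale, as in the USD/Percent wrappers Jim describes.
The class and method names, and the rounding mode, are illustrative
assumptions, not anything from the system under discussion.

import java.math.BigDecimal;
import java.math.RoundingMode;

public class CurrencyExamples {

    // Approach 1: hold the amount as integer cents; arithmetic is exact
    // and the decimal point only appears when formatting for display.
    static String centsToDisplay(long cents) {
        return String.format("%d.%02d", cents / 100, Math.abs(cents % 100));
    }

    // Approach 2: hold the amount as a BigDecimal with an explicit scale
    // (2 places for money, 5 for rates, matching the RPG layout above).
    static BigDecimal addInterest(BigDecimal principal, BigDecimal rate) {
        return principal.add(principal.multiply(rate))
                        .setScale(2, RoundingMode.HALF_EVEN);
    }

    public static void main(String[] args) {
        long priceInCents = 1999L + 501L;                 // exact integer sum
        System.out.println(centsToDisplay(priceInCents)); // 25.00

        BigDecimal principal = new BigDecimal("1000.00");
        BigDecimal rate = new BigDecimal("0.04500");
        System.out.println(addInterest(principal, rate)); // 1045.00
    }
}

Either way the representation is exact; the difference is whether the
programmer or the BigDecimal object keeps track of the scale.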
From: Lew on 9 Jul 2010 23:49

Jim Janney wrote:
>> I'm with Lew on this one: it's an important subject and I meet too
>> many programmers, many with CS degrees, who not only don't understand
>> it but don't even seem to know that it exists.
>>
>> I learned enough numerical analysis to understand how little I know
>> about it: many programmers don't seem to know even that.

Arne Vajhøj wrote:
> Numerical analysis is not a big CS topic.

Big enough to represent an entire undergraduate semester in the computer
science curriculum when I was at university. Big enough to be the source
of questions such as the OP's time and time again. Big enough to spawn
dozens, perhaps hundreds, of papers since the dawn of computer science,
and to cause a major redesign of floating-point hardware across the
industry a few decades ago. Big enough for 'strictfp' to be a keyword in
the Java language. Big enough to make it into popular entertainment
media. (The notion of skimming "roundoff error" from financial
transactions into a secret account has been the key to fictional crime
sprees in movies and novels.)

You're right in one sense, Arne. Numerical analysis isn't a big CS
topic, it's a *huge* CS topic.

--
Lew
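For readers who haven't met the keyword Lew mentions: 'strictfp' is a
class or method modifier that forces intermediate floating-point results
to be rounded to standard IEEE 754 width, so results are reproducible
across JVMs even on hardware whose FPU carries extra precision. A
minimal sketch (the class and method names are made up for illustration):

// Without strictfp, a JVM of this era was permitted to keep
// intermediates wider than 64 bits on some hardware; with it, every
// intermediate behaves as a true IEEE 754 double.
public strictfp class StrictSum {

    public static double scaled(double a, double b, double factor) {
        // Both products are rounded to 64-bit doubles before the
        // addition, so the result is bit-identical on every platform.
        return a * factor + b * factor;
    }

    public static void main(String[] args) {
        System.out.println(scaled(1e308, 1e10, 0.5));
    }
}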
From: blmblm on 10 Jul 2010 11:33

In article
<570f08fb-f157-4bf4-97c4-cf2439da1b28(a)z8g2000yqz.googlegroups.com>,
Lew <lew(a)lewscanon.com> wrote:

> Boris Punk wrote:
> > long l = 9999999999999L;
> > double f = 0.11111111111D;
> > double fl = f+l;
> > System.out.println(fl);
> >
> > =9.999999999999111E12
> >
> > Where's the rest of the 0.1111111's ?
>
> This is a FAQ. Don't they teach numerical analysis at university any
> more? What's with the education system these days anyway?

A data point, for what it's worth ....

At the school where I teach undergrad CS courses, numerical analysis is
still taught, but the courses that focus on it are not required parts of
the CS degree programs. I suspect that this is typical, and the reason
for it has to do with the ever-increasing number of things that
*someone* thinks All! Computer! Scientists! Must! Know!

I'm inclined to think that dropping the requirement for a whole course
in numerical analysis is not unreasonable, but then some of the basic
information (enough to answer the OP's question) should be folded into
some other course. *What* other course, though .... I usually mention
it in any programming course in which it might be relevant, but I'm not
sure everyone does.

(I'd agree with your later post that numerical analysis is a big CS
topic, and one worthy of study, but I'm not sure I'd agree with a claim
that it's something all programmers should be experts about.)

--
B. L. Massingill
ObDisclaimer: I don't speak for my employers; they return the favor.
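Since the OP's snippet keeps coming up, here is a short sketch (not part
of the original thread, variable names borrowed from the OP) showing
roughly where the missing digits go and how BigDecimal keeps them. The
specific printed values are what a standard JVM produces for these
inputs.

import java.math.BigDecimal;

public class DoublePrecisionDemo {

    public static void main(String[] args) {
        long l = 9999999999999L;        // 13 significant decimal digits
        double f = 0.11111111111D;
        double fl = f + l;

        // A double carries a 53-bit significand, roughly 15-16 decimal
        // digits. The integer part already uses 13 of them, so only
        // about three digits of the fraction survive the addition.
        System.out.println(fl);           // 9.999999999999111E12

        // The gap between adjacent doubles at this magnitude is about
        // 0.002, so nothing finer than that can be represented here.
        System.out.println(Math.ulp(fl)); // 0.001953125

        // BigDecimal keeps every digit, provided the fraction is built
        // from a String rather than from an already-rounded double.
        BigDecimal exact =
                new BigDecimal(l).add(new BigDecimal("0.11111111111"));
        System.out.println(exact);        // 9999999999999.11111111111
    }
}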
From: Lew on 10 Jul 2010 11:48

blmblm(a)myrealbox.com wrote:
> (I'd agree with your later post that numerical analysis is a big CS
> topic, and one worthy of study, but I'm not sure I'd agree with a
> claim that it's something all programmers should be experts about.)

Huh? I never said that.

--
Lew
From: blmblm on 10 Jul 2010 12:00
In article <i1a4nn$odp$1(a)news.albasani.net>, Lew <noone(a)lewscanon.com>
wrote:

> blmblm(a)myrealbox.com wrote:
> > (I'd agree with your later post that numerical analysis is a big CS
> > topic, and one worthy of study, but I'm not sure I'd agree with a
> > claim that it's something all programmers should be experts about.)
>
> Huh? I never said that.

And where did I say you did ....

Sorry if I wasn't clear: I originally wrote "I'm not sure I'd agree
that it's something ....", realized that I wasn't sure you'd made such
a claim, rewrote, and .... Still didn't communicate clearly! I guess I
should have said explicitly that I wasn't saying you were making such a
claim, rather than hoping that saying "a claim" rather than "your claim"
would convey my intended meaning.

"Whatever"?

--
B. L. Massingill
ObDisclaimer: I don't speak for my employers; they return the favor.