
> I learned that COBOL has the most accurate arbitrary precision math, surpassing modern languages.

Maybe that was true 20 years ago.




It was not, but for the most part languages don't default to arbitrary precision (i.e., simple mathematical operators with decimal literals typically get you binary floating point, not arbitrary-precision decimal), even if they have arbitrary precision available at the language or standard-library level.
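To make that concrete, here's what it looks like in Python (used here just as a stand-in for a typical modern language): the default literal syntax gives you IEEE 754 binary floats, and exact decimal arithmetic is in the standard library but you have to opt into it.

    from decimal import Decimal

    # Default syntax: binary floating point, with the usual representation error.
    print(0.1 + 0.2)         # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)  # False

    # Decimal arithmetic is in the stdlib, but only if you ask for it explicitly.
    print(Decimal("0.1") + Decimal("0.2"))                    # 0.3
    print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True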


Guess I was wrong. It's fixed point, and, as you said, COBOL defaults to it. It still makes it easier to write software that needs to do a lot of calculations like that.

https://medium.com/the-technical-archaeologist/is-cobol-hold...
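For anyone who hasn't looked at COBOL: a picture clause like PIC 9(7)V99 declares a fixed-point decimal field with two implied decimal places, and arithmetic on it stays decimal. A rough Python sketch of the same idea (the fixed() helper and TWO_PLACES name are just made up for illustration, and real COBOL rounding depends on the ROUNDED phrase):

    from decimal import Decimal, ROUND_HALF_UP

    TWO_PLACES = Decimal("0.01")  # analogue of a V99 picture: two implied decimal digits

    def fixed(value):
        # Keep the value at exactly two decimal places, roughly what a
        # COBOL PIC 9(n)V99 field enforces implicitly.
        return Decimal(value).quantize(TWO_PLACES, rounding=ROUND_HALF_UP)

    price = fixed("19.99")
    total = fixed(price * 3)
    print(total)  # 59.97 exactly; no binary floating point anywhere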


Yeah so that article also mentions something that I'd like to highlight:

> Overall we assume that the reason so much of civil society runs on COBOL is a combination of inertia and shortsightedness.

Well, there's more to it. Unfortunately, the better a programming language is, the easier it is to understand, change, and ultimately replace code written in it.

Conversely, code written in a worse language is harder to replace and therefore more likely to stay.

Mind you, I'm not saying that COBOL is bad or was a bad choice. This is really a general observation, but I believe it applies to COBOL at the current time.



