RE: Numerical Precision - What Level of Accuracy is 'Good Enough'
I didn't know that computers can't always store a number exactly when it's represented in binary. I guess that never came up in Informatics at school.
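For anyone else surprised by this, a quick Python sketch makes the effect visible; the `decimal` module can show the exact value a float actually stores:

```python
from decimal import Decimal

# 0.1 has no finite binary expansion, so a 64-bit float holds
# the nearest representable value rather than the exact number.
print(Decimal(0.1))         # 0.1000000000000000055511151231257827021181583404541015625
print(0.1 + 0.2 == 0.3)     # False: both sides carry tiny rounding errors
print(f"{0.1 + 0.2:.20f}")  # 0.30000000000000004441
```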
I wonder if anyone has worked out a general formula for determining what precision is required from your measuring and computational devices (computer hardware and software, in our case) for a given problem you're trying to solve.
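As far as I know there's no single universal formula, but on the hardware side the usual starting point is machine epsilon, the relative error introduced by a single rounding step. Here's a rough Python sketch of the common rule of thumb that a significand of p bits resolves about p * log10(2) decimal digits (the `decimal_digits` helper is just my own illustration, not a library function):

```python
import math
import sys

def decimal_digits(mantissa_bits: int) -> float:
    """Illustrative helper: approximate decimal digits a binary significand resolves."""
    # Each binary digit carries log10(2) ~ 0.301 decimal digits.
    return mantissa_bits * math.log10(2)

# 64-bit IEEE 754 doubles have 53 significant bits (52 stored + 1 implicit).
print(decimal_digits(53))      # ~15.95 decimal digits of precision
print(sys.float_info.epsilon)  # 2.220446049250313e-16: relative spacing at 1.0
print(sys.float_info.dig)      # 15: decimal digits guaranteed to survive a round trip
```

The software/problem side is harder to pin down in one formula, since it depends on how much the algorithm amplifies rounding error, which is why numerical analysts talk about condition numbers rather than a fixed precision requirement.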