News Link • Science

Teaching computers a new way to count could make numbers more accurate

• arclein

Changing the way numbers are stored in computers could improve the accuracy of calculations without increasing energy consumption or computing power, which could prove useful for software that needs to switch quickly between very large and very small numbers.

Numbers can be surprisingly difficult for computers to work with. The simplest are integers: whole numbers with no decimal point or fraction. As integers grow larger, they require more storage space, which can lead to problems when we attempt to reduce those requirements. The infamous millennium bug arose from computer programs storing the year as a two-digit number (99 for 1999), leading to the potential for confusion when the year rolled over to 2000.
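The two failure modes mentioned above can be sketched in a few lines of Python. This is an illustrative sketch only, not the method from the article: it shows how a two-digit year wraps around at 2000, and how a fixed-width floating-point number silently drops a small value added to a very large one.

```python
# 1. The millennium bug: a year stored as two digits wraps at 2000.
year_two_digit = 99                    # 1999 stored as "99"
next_year = (year_two_digit + 1) % 100
print(next_year)                       # 0 -- indistinguishable from 1900

# 2. Mixing very large and very small magnitudes in a standard
#    64-bit float: the small addend is below the float's precision
#    at that magnitude, so it vanishes entirely.
big, small = 1e16, 1.0
print(big + small == big)              # True -- the 1.0 is lost
```

The second example is exactly the regime the article describes: formats that adapt their precision to the magnitude of the number aim to avoid losing those small contributions.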
