Why do you measure precipitation in .01 (hundredths) of an inch instead of just .1 (tenths)? I know how small .01 is, and it's hardly enough to get the ground wet. Even if you measure it in hundredths, why not report it in tenths? There is no difference between .02 and .07 that can be detected by a person. Why don't you report in tenths and, if it is less than a tenth, report it as a trace?
John C.
**********************************************************
A valid question indeed. As people, we can't really tell whether it rained .02 or .03 of an inch, but for record keeping, that precision is essential. We report rain totals not just to say "hey, it rained there," but because all of those totals get tallied, so the National Weather Service has a good idea of whether we're short of or ahead of schedule on rainfall.
It's not just the NWS that records the rain, either. We have viewers who like to know specifically how much it rained for their own records; perhaps someone doesn't have a weather sensor at home but wants to know how much it rained in, say, Roy, UT, or another town.
If we reported rainfall as something other than its true amount, record keeping would be a nightmare. According to the American Meteorological Society's Glossary, a trace of rain is actually an amount less than .01 inches. Reporting .04 as a trace would simply be inaccurate.
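To see why hundredths matter for the tallies, here is a minimal sketch in Python with made-up daily readings. The numbers are invented for illustration; only the trace cutoff (less than .01 inches) comes from the AMS Glossary definition above. It shows how rounding each day to tenths before tallying shrinks the running total.

    # Hypothetical example: daily readings are invented; the trace cutoff
    # (< 0.01 in) follows the AMS Glossary definition cited above.
    def classify(amount_in):
        """Label a single daily precipitation reading."""
        return "trace" if amount_in < 0.01 else f"{amount_in:.2f} in"

    daily_readings = [0.02, 0.07, 0.04, 0.009, 0.03]  # invented sample week

    exact_total = sum(daily_readings)                         # kept in hundredths
    rounded_total = sum(round(r, 1) for r in daily_readings)  # reported in tenths

    for r in daily_readings:
        print(classify(r))

    print(f"Total kept in hundredths: {exact_total:.2f} in")   # 0.17 in
    print(f"Total if rounded to tenths first: {rounded_total:.2f} in")  # 0.10 in

In this made-up week, rounding each day to tenths before adding them up loses nearly half the recorded rainfall, which is exactly the record-keeping problem described above.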
While .02 or even .08 might not matter to everyone, we still report it for its precision. Those small amounts may not be detectable by a person, as you point out, but your lawn and other plants outside can tell the difference. By knowing exactly how much water we've received, we can be careful not to waste any extra.
Answered by KSL Meteorologist Dina Freedman.