We recently got a new freezer for the Blood Bank, and I am performing my first validation. I am comparing the NIST-traceable thermometer against the digital display, the chart recorder, and our own separate thermometer. I cannot find any documentation stating how much difference is acceptable between the NIST reading and the digital readout, our thermometer, etc. Some sources I have read say 0.5 degrees, which seems very tight to me. Another facility I work for uses a 1.0 degree difference, which seems more reasonable and achievable, but I have not found any supporting documentation anywhere!
Can someone help me? When validating, what difference is acceptable between the NIST and the other thermometers, be it the probes in the unit or your own internal thermometers?
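For what it's worth, the comparison itself is simple enough to script once you settle on an acceptance criterion. Below is a minimal sketch in Python; the readings and the 1.0 degree tolerance are placeholder assumptions (use whatever your facility's validation plan specifies), not a recommendation of either limit.

```python
# Minimal sketch of a thermometer-agreement check during freezer validation.
# All readings and the tolerance below are hypothetical examples --
# substitute your actual NIST-traceable reading and your SOP's criterion.

NIST_REFERENCE_C = -30.2  # example NIST-traceable reference reading

readings_c = {
    "digital display": -30.6,
    "chart recorder": -29.8,
    "internal thermometer": -31.0,
}

TOLERANCE_C = 1.0  # assumed acceptance criterion (per local SOP)

def check_agreement(reference, readings, tolerance):
    """Return {device: (difference from reference, within tolerance?)}."""
    results = {}
    for name, value in readings.items():
        diff = round(value - reference, 2)
        results[name] = (diff, abs(diff) <= tolerance)
    return results

for name, (diff, ok) in check_agreement(
        NIST_REFERENCE_C, readings_c, TOLERANCE_C).items():
    print(f"{name}: {diff:+.1f} C -> {'PASS' if ok else 'FAIL'}")
```

Running this prints each device's deviation from the reference and a PASS/FAIL against the chosen tolerance, which is also a convenient format to paste into the validation record.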