Search the Community
Showing results for tags 'thermometers'.
We recently got a new freezer for the Blood Bank, and I am performing my first validation. I am comparing the NIST-traceable reference thermometer against the digital display, the chart recorder, and our own separate thermometer. I cannot find any documentation stating how much difference is acceptable between the NIST reference and the digital readout, our thermometer, etc. Some sources I have read say 0.5 degrees, which seems very tight to me. Another facility I work for uses a 1.0 degree difference, which seems more reasonable and achievable, but I have not found any supporting documentation anywhere! Can someone help me? When validating, what difference is acceptable between the NIST reference and the other thermometers, be they the probes in the unit or your own internal thermometers?
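For what it's worth, the comparison described above boils down to a simple tolerance check. A minimal sketch, assuming a site-chosen acceptance limit (the 1.0 °C limit below is an illustration, not a documented standard, and the readings are made-up example values):

```python
# Hypothetical sketch: compare each thermometer against the NIST
# reference using a site-chosen acceptance limit. The 1.0 C limit
# and all readings below are assumptions for illustration only.

def within_tolerance(nist_reading, device_reading, limit_c=1.0):
    """Return True if the device agrees with the NIST reference
    within the chosen limit (degrees Celsius)."""
    return abs(nist_reading - device_reading) <= limit_c

# Example freezer validation readings (made-up values)
nist = -30.2
readings = {
    "digital display": -30.4,
    "chart recorder": -30.9,
    "internal thermometer": -29.8,
}
for name, temp in readings.items():
    status = "OK" if within_tolerance(nist, temp) else "OUT OF TOLERANCE"
    print(f"{name}: {temp} C vs NIST {nist} C -> {status}")
```

Whatever limit is chosen, the key is that it is defined in the facility's SOP before the validation runs, so the pass/fail call is not made after the fact.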
Hello All, I received two NIST-traceable thermometers today, one reading 21.0 C and the other 20.1 C. They are both perfectly acceptable according to the manufacturer's specifications. If I happen to receive an incoming shipment of platelets that reads 20.2 C on the first and 19.3 C on the second, will my second thermometer render my platelet shipment unacceptable?

When I used to calibrate glass thermometers against the NIST reference, I would label each one to reflect adding or subtracting 1 degree if the reading was off by 1 degree, in order to match the NIST value. I was wondering if that can be done with the electronic thermometers; they come with a certificate showing both the NIST reading and the instrument reading at the time of calibration.

I need one of the thermometers for my incubator, and I suspect the same thing can happen there. Say the actual temp is 37 but my reading is 36.1. Since my range is ±1, I can see it being a problem if my actual temp is 36.2, per se, and my thermometer is reading 35.3. Technically I'm within range according to the reference thermometer, but out of range according to the instrument. Any input on this?
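The correction-factor arithmetic described above (the electronic equivalent of labeling a glass thermometer with its offset) can be sketched as follows. This is a hypothetical illustration using the numbers from the post, not a statement of required practice:

```python
# Hypothetical sketch: apply the offset documented on a NIST
# calibration certificate before judging a reading against an
# acceptance range. All values come from the incubator example
# in the post and are for illustration only.

def corrected(reading, cert_instrument, cert_nist):
    """Adjust a raw reading by the certificate offset
    (NIST value minus instrument value at calibration)."""
    return reading + (cert_nist - cert_instrument)

def in_range(temp, target=37.0, tolerance=1.0):
    """Check a temperature against a target with a +/- tolerance."""
    return abs(temp - target) <= tolerance

# Certificate: instrument read 36.1 when the NIST reference read 37.0,
# so the offset is +0.9. A raw incubator reading of 35.3 corrects to 36.2.
raw = 35.3
adjusted = corrected(raw, cert_instrument=36.1, cert_nist=37.0)
print(f"raw {raw} C -> corrected {adjusted:.1f} C, "
      f"in range: {in_range(adjusted)}")
```

With the offset applied, the corrected 36.2 C falls inside a 37 ±1 range even though the raw 35.3 C did not, which is exactly the discrepancy the post describes.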