
gnixon

Members - Bounced Email
  • Posts

    5
  • Joined

  • Last visited

  • Country

    Canada

About gnixon

  • Birthday 03/26/1967

Contact Methods

  • Website URL
    http://www.theratronics.ca

Profile Information

  • Interests
    computer modelling, mathematics, running, racewalking
  • Biography
    Ph.D. (physics), 10 years experience in the medical device and radioisotope fields
  • Location
    Ottawa area
  • Occupation
    Senior Radiation Physicist

gnixon's Achievements

  1. The AABB stipulates that the data from your annual or semi-annual dosimetry reports are to be used as the criterion for validating your process for product irradiation and product release. However, the validation of radiation-sensitive indicators is something that has received little attention and is seldom considered or discussed. The AABB standards simply state that a "method shall be used to indicate that irradiation has occurred with each batch." Given the paucity of information, one cannot fault end-users for treating the validation of these indicators as a mere formality and using ad-hoc methods to deem the indicator validated. Considering the prevalence of these indicators and the associated expenses involved, it does seem rather odd that there is no recommended protocol for validating indicators. One may wonder why AABB bothers to recommend their use and yet provides few, if any, guidelines on their use and on the proper validation for that intended use. One such use may be in the case of operator failure and/or some instrumentation malfunction. In such a circumstance, the proper dose-response validation of the indicator could prove rather important. Let's say your irradiator faults out just before the irradiation cycle completes. Would you be able to release the product? No/maybe? Let's say that your annual (gamma) or semi-annual (x-ray) dosimetry report data indicate that the irradiation time given (corrected for source strength/decay) should be sufficient to meet the dose requirements (minimum dose >=15 Gy, dose >=25 Gy at the center, maximum dose <=50 Gy). Can the product then be released, all other factors being equal? Yes? What would happen if your indicators failed to indicate the product as irradiated to 15 Gy? Would you then irradiate further (say, in another irradiator) until the indicator changes? 
Would you reject the product if the indicator failed to change when you ran close to the maximum irradiation time for achieving a maximum dose below 50 Gy? Consider the fact that you may not even know what that maximum allowable irradiation time is because all you have is a dose map for the fully-loaded canister geometry. That time might prove to be less for a partially-filled canister. So, it is my opinion that the validation of indicators, using a well-drafted protocol, is important and that the validation should include dose-response verification to known doses. Sticking a new batch label side-by-side with one from the old batch and observing that both respond similarly just doesn't cut it, especially if the latter was not validated for dose-response. For all that you know, they may both be responding incorrectly.
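The three dose limits quoted above (minimum >=15 Gy, >=25 Gy at the center, maximum <=50 Gy) amount to a simple release check. A minimal sketch follows; the function name and the limits-as-defaults are my own framing for illustration, not an AABB-specified procedure:

```python
# Sketch of the dose-acceptance logic described above (hypothetical helper,
# not an AABB-defined API). All doses are in gray (Gy).
def can_release(min_dose, center_dose, max_dose,
                min_limit=15.0, center_limit=25.0, max_limit=50.0):
    """Return True only if all three mapped doses meet the stated limits."""
    return (min_dose >= min_limit
            and center_dose >= center_limit
            and max_dose <= max_limit)

print(can_release(16.0, 26.5, 48.0))  # within all limits -> True
print(can_release(14.0, 26.5, 48.0))  # minimum dose too low -> False
```

Note that this check uses the dose map for the fully-loaded canister; as discussed above, a partially-filled canister may violate the maximum even when the mapped values pass.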
  2. Dear Mini-Me et al: The primary purpose of the indicator is to provide some form of visual assurance, right on the blood bag, that the blood product has been irradiated. The secondary purpose is to provide a qualitative indication of the degree of radiation exposure. Some indicators provide a semi-quantitative indication of the level of dose: a color matching scheme is used whereby you can visually ascertain whether the absorbed dose lies between, say, 15 Gy and 50 Gy (e.g., RadTag brand indicators). However, indicators are not dosimeters and cannot be used as a substitute for quantitative dosimetry, nor do they eliminate the need for other process-control measures such as controlled segregation of nonirradiated from irradiated products. When selecting or validating indicators for blood product irradiation, you need to consider several factors. Chief among these are: 1 - suitability and convenience of use: ability to remain attached and withstand the stresses of irradiation as well as their integrity/stability w.r.t. the environmental conditions they are subjected to (e.g., extremes in temperature, relative humidity, exposure to ozone, UV, etc.). 2 - response: must be appropriate for the range of dose, dose rate, radiation energy, and environmental conditions experienced by the irradiated product. Once you have selected an indicator (one designed specifically for x-ray based units or one specific to gamma based units), and can ensure that you are only using them under the manufacturer's stated conditions of use, then your validation protocol should include the verification of the indicator's RESPONSE for your particular type of indicator as verified in YOUR irradiator under YOUR environmental conditions. This also applies when changing lots within a given brand or type. The response must be such that it does not indicate that an absorbed dose level of 15 Gy has been achieved when, say, only 10 Gy has been delivered. 
That is why I proposed the irradiation protocol and dose levels that I did. In order to test the response at accurate dose levels, you will most likely have to ensure that the sample chamber is completely filled with water-equivalent material and that you place the indicators in the centre. These are the conditions under which your dose rate has been measured and for which your timer setting has been set/calibrated. I agree with your comment that indicators should be treated as reagents - they are. Indicators typically have a limited shelf life and must be properly stored (e.g., refrigerated, kept out of UV/sunlight). Given enough time, most indicators will age to the point where their integrity is compromised and so it is important to respect the manufacturer's expiration labels. Best regards, Grant Nixon, Ph.D., P.Phys.
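The timer setting mentioned above must track source decay: as a Cs-137 source weakens, the irradiation time must be scaled up to deliver the same dose. A minimal sketch of that correction, assuming a Cs-137 half-life of about 30.17 years; the reference time and function names are illustrative, not taken from any manufacturer's manual:

```python
# Sketch of the source-decay correction behind a decay-corrected timer
# setting for a Cs-137 irradiator (half-life ~30.17 years). The specific
# numbers are illustrative only.
CS137_HALF_LIFE_Y = 30.17

def corrected_time(ref_time_min, years_since_calibration):
    """Scale the reference irradiation time up as the source decays."""
    decay_factor = 0.5 ** (years_since_calibration / CS137_HALF_LIFE_Y)
    return ref_time_min / decay_factor

# A setting of 5.0 min at calibration grows slightly each year, and
# doubles after one half-life.
print(round(corrected_time(5.0, 2.0), 2))
```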
  3. Dear "Mini Me": I do not like your proposed method because you likely do not know the dose rate for such irradiation geometries. Note that the reference geometry that is used to validate your timer setting is typically one where the canister is completely filled with a water-equivalent material to mimic a blood-filled canister (i.e., your annual dose mapping uses a polystyrene phantom that fully occupies the canister). The region with the most uniform dose rate is typically at the geometric centre of such a filled canister. This is, therefore, the preferred irradiation geometry (a filled canister) and irradiation location (centre of canister) you should be using for validating your indicators. This would give you some degree of traceability to previous measurements (e.g., past dosimetry reports) and allow for more accurate dosing to prescribed doses. If you do not have a polystyrene phantom and if your annual dosimetry is not being performed in the days ahead (where such a phantom would be available to you), you will have to find a way to completely fill the sample chamber with either expired blood bags or water-filled bags (the smaller the better). Ensure that the canister is completely and uniformly filled, leaving no air gaps. Place the indicators at the geometric centre of the sample chamber. Perform separate irradiation cycles on different sets of (say, 3) indicators, targeting, say, the following doses at the centre: 9 Gy, 12 Gy, 15 Gy, 18 Gy, 25 Gy, 50 Gy. I would check that the 15 Gy indicators change at the prescribed 15 Gy and higher doses. If the indicators indicate "irradiated" for any of the doses below 15 Gy, I would fail them. If they indicate "irradiated" only at 15 or 18 Gy as well as for ALL the higher doses, I would consider the batch a pass. Anything else would be deemed a fail. Note that your practice should be such that you never target a central dose that would result in a minimum dose below 18 Gy at any rate. 
You might also consider trying RadTag brand indicators. They differ from simple go/no-go indicators in that they provide a color change that can be color-matched with the reference colors for 15 Gy and even 50 Gy right on the same indicator. Best regards, Grant
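The pass/fail rule above can be sketched in a few lines. This is my own framing of the post's criteria (no indication below 15 Gy; first "irradiated" reading at 15 or 18 Gy; consistent indication at all doses from that point up), with the data layout chosen for illustration:

```python
# Sketch of the indicator-batch pass/fail rule described above.
# `results` maps each target central dose (Gy) to whether the
# indicators at that dose read "irradiated".
def batch_passes(results):
    # Fail outright if any sub-15 Gy dose triggers the indicator.
    if any(shown for dose, shown in results.items() if dose < 15):
        return False
    # The first "irradiated" reading must occur at 15 or 18 Gy...
    triggered = sorted(dose for dose, shown in results.items() if shown)
    if not triggered or triggered[0] not in (15, 18):
        return False
    # ...and every dose at or above that first trigger must also read
    # "irradiated" (an indicator that un-triggers at a higher dose fails).
    return all(shown for dose, shown in results.items() if dose >= triggered[0])

doses = {9: False, 12: False, 15: True, 18: True, 25: True, 50: True}
print(batch_passes(doses))  # -> True
```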
  4. The current direction is that the government intends to fund a source hardening retrofit for Cs-137 irradiators whereby the sources will be rendered even more inaccessible than they already are. To this end, some beta sites have already been retrofitted in this way. Further, there are efforts underway to manufacture less dispersible physical forms of Cs-137 material that would also prove less soluble. The latter efforts will likely take ~5 years to complete. Cesium irradiators have historically proven to be more reliable (and will ALWAYS be more reliable), cost less to maintain (x-ray tubes and power supplies are expensive consumables), and have a higher throughput than x-ray based units. As such, if your site has a critical need for just-in-time irradiated blood, then, given the current state of the art, cesium units continue to be the way to go. Grant
  5. Re. "When our inspector said someone could use a screwdriver to take the irradiator apart and walk off with the source" Your inspector clearly has no concept of how these machines are fabricated, where the sources are located, and of the rather extreme measures that would be required in order to gain access to them. This kind of misinformation speaks directly to their level of competence (or rather, incompetence). They should probably be reported to the state or national authority (as appropriate) because nobody needs this would-be "Chicken Little" running around. Certainly, it would prove a public service if you were to "out" him. I would recommend that the inspector NOT be retrained, however, because it would only serve to make them (marginally) more dangerous. Best regards, Grant