Okay, Navy CaF2 dosimeters were calibrated and shielded for really high energies, while the "newer" LiF dosimeters handle a much broader band of energies, with the clipping performed by the reader's software. (I was an ELT who processed dosimetry at PHNSY while on LIMDU, so I know both systems.) Ionization occurs at energies lower than those accurately measured by Navy equipment.
Do commercial dosimeters get clipped at the same high energy as Navy dosimeters? Does the clipping artificially lower the measured dose? If so, are Navy dose values legit?
I was an ELT on a sub and everyone's dose was consistent with the surveys and power history.
Commercial dosimeter processors are required to be certified by NVLAP for various energy ranges of radiation, so there shouldn't be any clipping going on. The NVLAP categories are:
Category I - Accident Photons.
Category II - Protection Level Photons.
Category III - Betas.
Category IV - Photon Mixtures.
Category V - Beta and Photon Mixtures.
Category VI - Neutron and Photon Mixtures.
As an NVLAP assessor who assessed many of the Navy shipyards back when they used the bulb detectors, I can say those dosimeters only detected the high-energy photons. The Navy has either transitioned totally, or nearly totally, to the Harshaw 4-element copper-doped TLD. I've assessed Puget Sound, Electric Boat, Pearl Harbor, Newport News, New London, and the Naval Dosimetry Center in Bethesda. I'm going back to Puget Sound and Silverdale sometime this year. It was my understanding that the ships were all going to the Harshaw badge, including the subs. If so, the dose results will be more appropriate for the various radiation environments.
Quote from: NukeNub on Jan 17, 2009, 02:15
I was an ELT on a sub and everyone's dose was consistent with the surveys and power history.
Yeah, who wasn't? Sorry, that was a little glib. What I mean is that many of us were ELTs on many subs and carriers and a couple of cruisers. The point is that the TLDs we used were somewhat rudimentary compared to those used in civilian applications.
The fact that the doses were "consistent" with radiation levels and power history can be misleading. First, the surveys we did usually showed levels of 0.1 to 0.4 mR/hr. There's a lot of room for error at levels that low. The difference between 0.37 and 0.42 mR/hr might seem small, but it adds up over a year or so.
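To put a rough number on that (a sketch only; the 2,000 hours per year of occupancy is my assumption, not a Navy figure):

```python
# Back-of-the-envelope: how a small survey-meter error compounds over a year.
# The occupancy figure is an assumption for illustration only.

hours_per_year = 2000            # assumed time spent in the surveyed spaces
reading_mr_per_hr = 0.37         # what the survey meter shows
actual_mr_per_hr = 0.42          # what the field might really be

reported = reading_mr_per_hr * hours_per_year   # 740 mR
actual = actual_mr_per_hr * hours_per_year      # 840 mR

print(f"Reported annual dose: {reported:.0f} mR")
print(f"Actual annual dose:   {actual:.0f} mR")
print(f"Unreported:           {actual - reported:.0f} mR "
      f"({(actual - reported) / actual:.0%} of the true dose)")
```

That hypothetical 0.05 mR/hr error turns into about 100 mR per year, roughly 12% of the true dose.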
Second, we surveyed everything with GM tubes, which also happen to under-respond to high-energy photons. Basically, one gamma is one click. Of course, we know that a 0.63 MeV gamma ray does not have the same effect on a human body as a 1.33 MeV gamma. Yet both register the same on the old AN/PDR-27.
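To illustrate the "one gamma, one click" problem: the meter converts counts to dose rate with a single factor fixed at the calibration energy, so two fields with the same count rate read the same even when they deposit different dose. A minimal sketch with made-up numbers (not AN/PDR-27 data), using a crude energy-proportional scaling for the true dose:

```python
# Sketch of why a GM survey meter mis-reports dose rate away from its calibration energy.
# A GM tube registers one count per interacting photon regardless of photon energy,
# so the readout applies one fixed counts-to-mR/hr factor set at the calibration energy.
# All values here are illustrative assumptions, not instrument data.

CAL_FACTOR_MR_PER_HR_PER_CPS = 0.002   # assumed factor, set with a Co-60 (1.33 MeV) source

def meter_reading(count_rate_cps: float) -> float:
    """Dose rate the meter displays: count rate times one fixed factor."""
    return count_rate_cps * CAL_FACTOR_MR_PER_HR_PER_CPS

count_rate = 200.0                      # cps, identical for both fields
displayed = meter_reading(count_rate)   # same readout for 0.63 MeV and 1.33 MeV photons

# Very crude approximation: per-photon dose scales roughly with photon energy.
rough_true_dose_0_63_mev = displayed * (0.63 / 1.33)

print(f"Meter shows          : {displayed:.2f} mR/hr for both fields")
print(f"Rough true dose rate : {rough_true_dose_0_63_mev:.2f} mR/hr for the 0.63 MeV field")
```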
I'm hoping that the guys on my old boat are not listening to Walkman cassette players like I did, and I hope their radiation detection instruments have likewise evolved.
ANSI N13.11-2009 was published, and the performance test categories currently in effect are as follows:
Category I - Accident photons (0.05 to 5 Gy)
A. General (B and C, random)
B. 137Cs
C. M150
Category II - Photons/photon mixtures (0.5 to 50 mSv)
A. General (E ≥ 20 keV; ⊥ if ≤ 70 keV)
B. High E (137Cs, 60Co; α ≤ 60°)
C. Medium E (E > 70 keV, α ≤ 60°)
D. Plutonium specific (see Appendix A, Section A2)
Category III - Betas (2.5 to 250 mSv)
A. General (B and C, random)
B. High E (90Sr/90Y)
C. Low E (85Kr)
D. Uranium slab
Category IV - Photon/beta mixtures (shallow: 3.0 to 300 mSv; deep: 0.5 to 50 mSv)
Category V - Neutron/photon mixtures (1.5 to 50 mSv)
A. General (B and C, random)
B. 252Cf + II
C. 252Cf(D2O) + II
To clarify: The TLDs, both CaF2 and LiF (the Harshaw units are Ti doped, BTW), are read by units that clip low energies, with the emphasis on high-energy gamma exposure only. The reader software (as of two years ago) for the Harshaw units only reports dose from one of the four chips: the one calibrated for deep dose. The glow curve is clipped to the high-energy dose, ignoring lower-energy exposure, some of which is whole-body penetrating. Special processing parameters allow access to the raw data and to the other chips' recorded dose for neutron exposure and extremity monitoring.
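One way to picture the "clipping" described above: the reader integrates the glow curve only over a chosen region of interest (ROI) and ignores the rest of the channels, along with the other chips, when it produces the routine deep-dose number. The sketch below is my own simplified illustration with a synthetic glow curve and made-up channel ranges, not the actual Harshaw reader algorithm:

```python
import math

# Simplified region-of-interest (ROI) integration on a TLD glow curve.
# The synthetic curve, channel ranges, and calibration factor are all hypothetical.

def element_dose(glow_curve, roi_start, roi_stop, cal_factor):
    """Integrate charge over only the chosen ROI channels and apply a calibration factor."""
    return sum(glow_curve[roi_start:roi_stop]) * cal_factor

# Synthetic 200-channel glow curve: one low-temperature peak near channel 60 and
# one high-temperature peak near channel 150.
glow_curve = [
    10 * math.exp(-((ch - 60) / 15) ** 2) + 25 * math.exp(-((ch - 150) / 20) ** 2)
    for ch in range(200)
]

full_integral    = element_dose(glow_curve, 0, 200, cal_factor=1.0)
clipped_integral = element_dose(glow_curve, 120, 200, cal_factor=1.0)  # high-temperature ROI only

print(f"Whole-curve integral : {full_integral:.0f} (arbitrary units)")
print(f"Clipped ROI integral : {clipped_integral:.0f} (arbitrary units)")
# In this simplified picture, everything below channel 120 -- and the other three
# chips on the card -- never makes it into the routinely reported deep dose.
```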
The beta/gamma survey meters in use by the Navy, whether the MFR or the AN/PDR-27 or -66, are all calibrated at high (Co-60) energies. As mentioned before, they are exclusively GM-tube based.
Does the combination of these limitations mean that Naval exposure is under-reported with respect to lower energies from other radioisotopes? Co-60 is only one of the isotopes present, and newer plants are rumored to be free of Co-containing components. Should Navy radiation monitoring equipment be recalibrated to lower energies in the interest of more accurate dose reporting?
Having assessed many of the shipyards and the Naval Dosimetry Center (NDC), I believe that all of the TLDs are MCP (copper doped) now. One of the reasons the Navy went to the new TLD at NDC and the shipyards was that the TLD they were using only reported high-energy deep dose, yet scattered radiation produces lower-energy photons that were neither detected nor reported. Most of the Navy facilities are accredited under NVLAP 100650-0.
How about neutron dosimetry? What's new? There are many problems associated with neutron detection and dosimetry. I have three distinctly different areas where the predominant concern is neutrons. It looks like ROSPEC and conversion factors applied to a Panasonic 809A will be used to assign dose. Any ideas or references?
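In case it helps, the usual approach I've seen is to fold the ROSPEC fluence spectrum with fluence-to-dose conversion coefficients (e.g., the ICRP 74 tables) to get a reference dose for the workplace field, then ratio that against the badge's response in the same field to get a site-specific correction factor. A minimal sketch with placeholder bins, coefficients, and badge reading (real work would use the published coefficient tables and your measured spectra):

```python
# Sketch: derive a field-specific neutron correction factor for a badge by folding a
# measured fluence spectrum with fluence-to-dose conversion coefficients.
# All numbers are placeholders, not real spectra, ICRP 74 coefficients, or badge data.

bin_energy_mev   = [0.01, 0.1, 0.5, 1.0, 2.0, 5.0]             # bin centers (MeV), for reference
fluence_per_bin  = [2.0e5, 1.5e5, 8.0e4, 5.0e4, 2.0e4, 5.0e3]  # n/cm^2 per bin (placeholder)
h_psv_cm2        = [10.0, 80.0, 280.0, 410.0, 420.0, 400.0]    # pSv*cm^2 per bin (placeholder)

# Reference dose from the spectrum fold (1 mrem = 1e7 pSv).
reference_psv  = sum(f * h for f, h in zip(fluence_per_bin, h_psv_cm2))
reference_mrem = reference_psv / 1.0e7

badge_reported_mrem = 3.5   # badge response in the same field (placeholder)

correction_factor = reference_mrem / badge_reported_mrem
print(f"Spectrum-folded reference dose  : {reference_mrem:.2f} mrem")
print(f"Field-specific correction factor: {correction_factor:.2f}")
```

The correction factor then gets applied to routine badge results for that area, with a separate factor for each spectrally different location.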