To clarify: the TLDs, both CaF2 and LiF (the Harshaw chips are Ti-doped, BTW), are read out by units that clip low energies, emphasizing high-energy gamma exposure only. The reader software for the Harshaw units (as of two years ago) reports dose from only one of the four chips: the one calibrated for deep dose. The glow curve is clipped to the high-energy dose, ignoring lower-energy exposure, some of which is whole-body penetrating. Special processing parameters allow access to the raw data and to the other chips' recorded doses for neutron exposure and extremity monitoring.
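For illustration only, here's a minimal Python sketch of the two behaviors described above: integrating a glow curve over a restricted region of interest, and routinely reporting just the deep-dose chip out of four. The ROI bounds, chip labels, and calibration factor are assumptions for demonstration; the actual Harshaw reader algorithm is proprietary and surely more sophisticated.

```python
import numpy as np

def dose_from_glow_curve(temps_c, intensity, roi=(180.0, 260.0), cal_factor=1.0):
    """Integrate a TLD glow curve over a temperature region of interest.

    Counts outside the ROI are discarded, analogous to the 'clipping'
    described above: only the selected dosimetric peak contributes to
    the reported dose. ROI bounds and cal_factor here are placeholders.
    """
    temps_c = np.asarray(temps_c)
    intensity = np.asarray(intensity)
    mask = (temps_c >= roi[0]) & (temps_c <= roi[1])
    # Trapezoidal integration of the selected portion of the curve only.
    return cal_factor * np.trapz(intensity[mask], temps_c[mask])

# A four-chip badge (hypothetical readings, mSv). The routine report
# carries only the deep-dose chip; the others require special
# processing parameters to access.
chip_doses = {
    "chip1_shallow":   0.12,
    "chip2_deep":      0.10,
    "chip3_neutron":   0.05,
    "chip4_extremity": 0.11,
}
reported = chip_doses["chip2_deep"]  # the only value routinely reported
```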
The beta/gamma survey meters in use by the Navy, whether the MFR or the AN/PDR-27 or AN/PDR-66, are all calibrated for high-energy (Co-60) gamma. As mentioned before, they are exclusively G-M tube based.
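To see why a Co-60-only calibration matters, here's a hedged sketch: an assumed relative energy-response curve (normalized to 1.0 at the ~1.25 MeV Co-60 calibration point) and the correction an indicated reading would need at lower photon energies. The response values are illustrative, not measured data for any of these instruments.

```python
import numpy as np

# Assumed relative energy response for an energy-compensated G-M tube,
# normalized to 1.0 at the Co-60 calibration energy (~1.25 MeV).
# These numbers are invented for demonstration only.
energy_kev   = np.array([40,   60,   80,   100,  300,  662,  1250])
rel_response = np.array([0.10, 0.55, 0.85, 0.95, 1.00, 1.00, 1.00])

def true_dose_rate(indicated_msv_h, photon_kev):
    """Correct an indicated reading for energy response.

    The meter is calibrated so indicated == true at Co-60 energies; at
    lower energies the (assumed) response falls off, so the uncorrected
    reading under-reports the actual dose rate.
    """
    r = np.interp(photon_kev, energy_kev, rel_response)
    return indicated_msv_h / r

# Example: a field dominated by ~60 keV photons (e.g., Am-241).
# With these assumed numbers, an indicated 0.055 mSv/h corresponds
# to roughly 0.10 mSv/h actual -- the reading shows about half.
print(true_dose_rate(0.055, 60))  # ~0.10
```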
Does the combination of these limitations mean that Naval exposure is under-reported with respect to lower energies from other radioisotopes? Co-60 is only one of the isotopes present, and newer plants are rumored to be free of cobalt-containing components. Should Navy radiation monitoring equipment be recalibrated to lower energies in the interest of more accurate dose reporting?