The typical radioisotope calibrator contains an ionization chamber, a high-voltage power supply, an electronic amplifier, and a display unit on which one selects the radioisotope to be calibrated. The ionization chamber is cylindrical in shape (see photos below) and measures the total ionization produced by the sample being calibrated. The hermetically sealed chamber contains argon gas under high pressure (often 20-30 atm) and two electrodes with an electric potential between them. When a vial or syringe containing the radionuclide is placed into the chamber, the argon gas is ionized; the ion pairs migrate toward the anode and cathode, and an electrical current flows between them. This current is proportional to the activity of the measured radioisotope. Its magnitude is usually very small (on the microampere level), even when large amounts of activity are present, so a device called an electrometer, designed to quantify very small electric currents, is used, and its output is displayed in either mCi or MBq.

Dose calibrator function depends on a number of parameters. Most important are the activity, the energy of the photons, and whether particulate emitters (e.g., beta emitters) are being calibrated. The chamber's response to a pure gamma emitter such as Tc-99m differs from its response to a beta/gamma emitter such as I-131, so the dose calibrator requires a different internal setting for each individual radioisotope. Calibrating a pure beta emitter such as Y-90 is particularly difficult: the readings are highly geometry dependent, in addition to other issues such as self-absorption within the sample being measured and bremsstrahlung X-ray production when the beta particles interact with the lead shielding surrounding the ionization chamber.
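The conversion described above can be sketched in a few lines: the electrometer current is multiplied by an isotope-specific calibration factor to obtain activity, and the result can be reported in mCi or MBq (1 mCi = 37 MBq exactly). The calibration factors below are illustrative placeholders, not real instrument values; the function name is hypothetical.

```python
# Illustrative isotope-specific calibration factors (mCi per microampere).
# These numbers are assumptions for the sketch, not real instrument settings.
CAL_FACTORS_MCI_PER_UA = {
    "Tc-99m": 80.0,  # pure gamma emitter (assumed value)
    "I-131": 45.0,   # beta/gamma emitter (assumed value)
}

MBQ_PER_MCI = 37.0  # exact unit conversion: 1 mCi = 37 MBq


def activity_from_current(current_ua: float, isotope: str, unit: str = "mCi") -> float:
    """Convert electrometer current (microamperes) to displayed activity.

    Selecting the isotope on the display unit amounts to selecting
    its calibration factor, as the text describes.
    """
    mci = current_ua * CAL_FACTORS_MCI_PER_UA[isotope]
    return mci if unit == "mCi" else mci * MBQ_PER_MCI


print(activity_from_current(0.125, "Tc-99m"))          # 10.0 (mCi)
print(activity_from_current(0.125, "Tc-99m", "MBq"))   # 370.0 (MBq)
```

The point of the sketch is only that a sub-microampere current maps to a clinically meaningful activity once the isotope-specific setting is applied.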
The use of custom-designed lightweight plastic sample holders and deep-well detectors has virtually eliminated poor results caused by variations in sample positioning in the well.