Thomas Hangartner (Advisor), David Short (Committee Member), Julie Skipper (Committee Member)
Master of Science in Engineering (MSEgr)
The OsteoQuant is a second-generation pQCT scanner, which provides precise density assessment of bone. The scanner is being upgraded with an x-ray-tube radiation source and a semiconductor CZT detector. This thesis provides solutions for two issues: motion-detector timing and dead-time/beam-hardening corrections.

During translation, the motion-control system of the scanner repositions the source and detector in small intervals. However, the detector collects photon counts asynchronously with respect to the motor-timing pulses and reads out photon counts (frames) at regular intervals based on its own clock. Since the source/detector location must correspond to the detector readout, the first part of this project deals with relating the motor- and detector-timing pulses to each other. The goal is to implement a system capable of registering these pulses with microsecond resolution using a common time base. These time stamps can then be used to relate each detector frame accurately to a motor position. The timing pulses are measured with two 32-bit counters, controlled by a common time base and supervised by a LabVIEW program. This counter system is capable of measuring a signal with a time period of 2.22 ms with a maximum error of ±3 μs.

The second part of this project deals with methods to correct the effects of dead time and beam hardening. Dead time is a property of the detector system, whereas beam hardening results from the attenuation of the poly-energetic spectrum of the x-ray beam. The aim is to correct the photon-count loss due to dead time to an error level of less than 0.5% of the maximum expected photon counts, and the non-linearity of the projection values due to beam hardening to an error level of less than 1% of the expected maximum projection value. The measured photon-counts-vs.-current curves were described using a fourth-degree polynomial and then linearized to their expected counts.
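The dead-time linearization just described can be sketched as follows. This is only an illustration under assumed values, not the thesis's measured data: a non-paralyzable dead-time model with a made-up dead time `TAU` and counts-per-mA slope `RATE_PER_MA` stands in for the real detector behaviour.

```python
import numpy as np

# Assumed, illustrative numbers (not the thesis's values): true counts are
# proportional to tube current, and the detector loses counts following a
# non-paralyzable dead-time model m = n / (1 + n * TAU).
TAU = 2e-6          # assumed dead-time factor per count (illustrative)
RATE_PER_MA = 5e4   # assumed true counts per mA per frame (illustrative)

current = np.linspace(0.1, 2.0, 20)           # tube current [mA]
expected = RATE_PER_MA * current              # linear counts, no dead time
measured = expected / (1.0 + expected * TAU)  # counts after dead-time loss

# Describe the measured counts-vs.-current behaviour with a fourth-degree
# polynomial that maps measured counts back onto their expected values.
lin_coeffs = np.polyfit(measured, expected, 4)
corrected = np.polyval(lin_coeffs, measured)

# Residual error relative to the maximum expected counts
rel_err = np.max(np.abs(corrected - expected)) / expected.max()
```

With these assumed parameters the residual stays well under the 0.5%-of-maximum-counts target, since the rational dead-time curve is closely approximated by a fourth-degree polynomial over this range.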
This linearization provided corrected photon counts within the set error level. For the beam-hardening correction, the absorber was simulated by aluminum/Plexiglas slabs up to the equivalent thickness of the forearm and the lower part of the leg. The projection-vs.-thickness curves were mathematically modeled using a fifth-degree polynomial and a bimodal-energy model and then linearized. Both corrections supplied satisfactory results if applied to data sets measured on the same day. However, measuring the slabs to calculate the correction parameters is tedious. To avoid daily measurement of all the slabs, a simplified approach was developed: the primary corrections from one particular date are applied to the data sets collected on other dates, followed by a secondary correction based on only a few plates measured on the specific date. This secondary correction, based on a third-degree polynomial, resulted in residuals within the desired error range. The largest of these residuals were less than 0.012 projection-value units for the bimodal primary corrections and less than 0.017 projection-value units for the polynomial primary corrections for the 10-slab data set, which simulated the forearm thickness. For the 19-slab data set, which simulated the leg thickness, the maximum residuals were 0.03 and 0.04 projection-value units for the bimodal and polynomial primary corrections, respectively. Thus, the bimodal-energy model performed better than the polynomial model for the beam-hardening correction.
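The bimodal-energy correction can be sketched as follows. The spectral weights and attenuation coefficients here are assumed for illustration, not the thesis's fitted parameters: the beam is treated as two effective energies, the beam-hardened projection of each slab thickness is computed, and the model is inverted (here by bisection) to recover a projection that is linear in thickness.

```python
import numpy as np

# Illustrative bimodal-energy model: two effective energies with assumed
# weights W1, W2 and attenuation coefficients MU1, MU2 (not fitted values).
W1, W2 = 0.6, 0.4
MU1, MU2 = 0.045, 0.025   # attenuation coefficients [1/mm], assumed

def bimodal_projection(t):
    """Beam-hardened projection of a slab of thickness t [mm]."""
    return -np.log(W1 * np.exp(-MU1 * t) + W2 * np.exp(-MU2 * t))

def linearize(p, t_max=300.0, iters=60):
    """Invert the bimodal model by bisection and return the linearized
    projection mu_eff * t, which is proportional to thickness."""
    mu_eff = W1 * MU1 + W2 * MU2    # slope of the projection at t = 0
    lo, hi = 0.0, t_max
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if bimodal_projection(mid) < p:
            lo = mid
        else:
            hi = mid
    return mu_eff * 0.5 * (lo + hi)

thickness = np.linspace(0.0, 100.0, 21)        # slab thicknesses [mm]
measured = bimodal_projection(thickness)       # nonlinear, beam-hardened
corrected = np.array([linearize(p) for p in measured])
ideal = (W1 * MU1 + W2 * MU2) * thickness      # perfectly linear projection
max_resid = np.max(np.abs(corrected - ideal))
```

Because the inversion is exact up to the bisection tolerance, the residuals fall far inside the 1%-of-maximum-projection target; a fifth-degree polynomial fitted to the same measured-vs.-ideal pairs would play the role of the thesis's polynomial model.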
Department or Program
Department of Biomedical, Industrial & Human Factors Engineering
Year Degree Awarded
Copyright 2009, all rights reserved. This open access ETD is published by Wright State University and OhioLINK.