Committee Members

Ping He (Advisor), Doug Petkie (Committee Member), Julie Skipper (Advisor)

Degree Name

Master of Science in Engineering (MSEgr)


Real-time, stand-off sensing of humans to detect emotional state would be valuable in many defense, security and medical scenarios. Using a multimodal sensor platform that incorporates high-resolution visible-wavelength and mid-wave infrared (MWIR) cameras and a millimeter-wave (mmW) radar system, the detection of physiological indicators of psychological stress was tested through laboratory experiments. Our approach focuses on thermal imaging to measure temperature patterns in distinct facial regions that reflect underlying hemodynamic patterns. Experiments were designed to: 1) determine the ability of thermal imaging to detect high levels of psychological stress and to distinguish responses to physical versus psychological stressors; 2) evaluate the fidelity of vital signs extracted from thermal imagery and radar signatures; and 3) investigate the stability of thermal imaging under various confounding factors and real-world limitations. To achieve the first objective, registered image and sensor data were collected as subjects (n=32) performed mental and physical tasks. In each image, the face was segmented into 29 non-overlapping segments based on fiducial points automatically output by our facial feature tracker. Image features were defined that facilitated discrimination between psychological and physical stress states. Four classifiers (artificial neural network, naive Bayes, linear discriminant analysis, and support vector machine) were trained and tested on a down-selected set of salient features to evaluate efficacy under several classification paradigms. The method proved highly effective: based on estimated physiological ground truth, we were 100% accurate in classifying subjects with high mental stress, and nearly 99% accurate in classifying mental versus physical stress.
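To illustrate the classification step, the following is a minimal sketch of one of the four classifiers named above (naive Bayes) applied to synthetic stand-ins for per-region thermal features. The feature values, region names, and class separations here are invented for illustration; the thesis's actual feature definitions and trained models are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D feature vectors (e.g., mean temperature change in the
# perioptic and nose regions) for two classes: mental stress vs. baseline.
n_per_class = 50
stress = rng.normal(loc=[0.8, -0.5], scale=0.3, size=(n_per_class, 2))
baseline = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(n_per_class, 2))

X = np.vstack([stress, baseline])
y = np.array([1] * n_per_class + [0] * n_per_class)

def fit_gaussian_nb(X, y):
    """Estimate per-class feature means, variances, and priors."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict(params, X):
    """Assign each sample to the class with the highest log-posterior."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, var, prior = params[c]
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * var)
                                + (X - mu) ** 2 / var, axis=1)
        scores.append(log_lik + np.log(prior))
    return np.array(classes)[np.argmax(np.array(scores), axis=0)]

params = fit_gaussian_nb(X, y)
accuracy = np.mean(predict(params, X) == y)
```

In practice each of the four classifiers would be trained on the same down-selected feature set and compared under the classification paradigms described above; this sketch only shows the mechanics of one of them.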
The performance of two non-contact techniques for detecting respiration and heart rate was evaluated: chest displacement extracted from the mmW radar signal, and temperature fluctuations extracted from the MWIR imagery at the nose tip (for respiration) and in regions near superficial arteries (for heart rate). Results of the two techniques at stand-off distances of approximately six feet were similar. During baseline, estimated respiration rates from the MWIR imagery were accurate within one breath per minute for 72% of the samples and within two breaths per minute for 87% of the samples (67% and 76%, respectively, from the mmW radar). Estimated heart rates from the MWIR imagery were within two beats per minute for 27% of the samples and within six beats per minute for 53% of the samples (34% and 49%, respectively, from the mmW radar). The stability of the human thermal signatures was tested in additional experiments investigating the effects of subject-to-imager distance, subject pose, normal day-to-day variability, facial muscle activation and topical skin products. Quantitative results for the effects of each of these factors on thermal signatures are provided. Autonomous face tracking succeeded when the subject-to-imager distance was 15 feet or less and the pose angle 30° or less. The perioptic region of the face exhibited the most stable thermal signatures, and the nose region was least stable. The confounding factors of local, forceful muscle activation and one skin product (liquid makeup) had a significant impact on measured values, each raising the skin temperature of specific facial regions by up to 1.5°C. By characterizing the sensor suite under challenging conditions, its use for assessing human state in operationally feasible settings has been validated.
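The rate-extraction idea above can be sketched as a spectral-peak estimate on a periodic time series. This is a minimal illustration, not the thesis's actual pipeline: the sampling rate, signal amplitudes, and frequency band are assumed, and a simulated nose-tip temperature trace stands in for real MWIR data (a radar chest-displacement trace could be processed the same way with a heart-rate band).

```python
import numpy as np

# Simulated nose-tip temperature trace: 60 s at an assumed 30 Hz frame
# rate, with a 15 breaths/min oscillation (~0.05 degC swing) plus noise.
fs = 30.0
t = np.arange(0, 60, 1 / fs)
true_rate_bpm = 15.0
signal = 0.05 * np.sin(2 * np.pi * (true_rate_bpm / 60.0) * t)
signal += 0.01 * np.random.default_rng(1).normal(size=t.size)

def dominant_rate_bpm(x, fs, lo_hz=0.1, hi_hz=0.7):
    """Return the strongest spectral peak within a physiologic band,
    converted to cycles per minute."""
    x = x - x.mean()                              # drop the DC (mean) term
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo_hz) & (freqs <= hi_hz)    # plausible respiration band
    return 60.0 * freqs[band][np.argmax(power[band])]

est_bpm = dominant_rate_bpm(signal, fs)
```

The window length sets the frequency resolution (here 1/60 Hz, i.e., 1 cycle/min), which is one reason longer observation windows tighten rate estimates.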

Department or Program

Department of Biomedical, Industrial & Human Factors Engineering

Creative Commons License

Creative Commons Attribution-Noncommercial-Share Alike 3.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License.