Vital sign estimation from passive thermal video

Ming Yang*, Qiong Liu, Thea Turner, Ying Wu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

27 Scopus citations

Abstract

Conventional wired detection of vital signs limits the use of these important physiological parameters in many applications, such as airport health screening, elder care, and workplace preventive care. In this paper, we explore contact-free heart rate and respiratory rate detection by measuring the modulation of infrared radiation emitted near superficial blood vessels (for heart rate) and near the nasal area (for respiratory rate). To deal with complications caused by subjects' movements, facial expressions, and partial occlusions of the skin, we propose a novel algorithm based on contour segmentation and tracking, clustering of informative pixels, and dominant frequency component estimation. The proposed method achieves robust region-of-interest alignment and motion compensation in infrared video with low SNR. It relaxes some strong assumptions used in previous work and substantially improves on previously reported performance. Preliminary experiments on heart rate estimation for 20 subjects and respiratory rate estimation for 8 subjects exhibit promising results.
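The abstract's final processing stage, dominant frequency component estimation, can be illustrated in isolation: given a mean ROI intensity time series extracted from the thermal video, the vital-sign rate corresponds to the strongest spectral peak within a physiological band. The sketch below is not the paper's algorithm (which also involves contour tracking and pixel clustering); it is a minimal FFT-based peak picker under assumed names (`dominant_frequency_bpm`) and an assumed heart-rate band of 0.7–3.0 Hz (~42–180 bpm).

```python
import numpy as np

def dominant_frequency_bpm(signal, fps, band=(0.7, 3.0)):
    """Estimate the dominant oscillation rate (beats per minute) of a
    mean ROI intensity time series, restricted to a physiological band.
    The 0.7-3.0 Hz default is an assumed adult heart-rate range."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()              # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))       # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    peak_hz = freqs[mask][np.argmax(spectrum[mask])]
    return peak_hz * 60.0                        # Hz -> beats per minute

# Synthetic check: a 1.2 Hz (72 bpm) oscillation sampled at 30 fps with noise
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1.0 / 30)
series = np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)
print(round(dominant_frequency_bpm(series, fps=30)))  # -> 72
```

Restricting the search to a physiological band is what makes this robust to low-frequency drift (e.g. slow subject motion) that would otherwise dominate the spectrum.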

Original language: English (US)
Title of host publication: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
DOIs
State: Published - 2008
Event: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR - Anchorage, AK, United States
Duration: Jun 23 2008 - Jun 28 2008

Publication series

Name: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR

Other

Other: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
Country: United States
City: Anchorage, AK
Period: 6/23/08 - 6/28/08

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Control and Systems Engineering

