Researchers at the University of Washington (UW) and Microsoft Research have developed a system that uses a person's smartphone or computer camera to read pulse and respiration from video of their face.
The system preserves privacy by running on the device rather than in the cloud; machine learning (ML) detects subtle changes in the light reflected off a person's face, which correlate with changes in blood flow.
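To make the idea concrete, the following is a minimal sketch of the general signal-extraction principle described above, not the researchers' code: average the pixels of a face crop in each frame, then isolate the heart-rate band of that intensity trace. The frame layout, sampling rate, and band edges are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_pulse_bpm(frames: np.ndarray, fps: float = 30.0) -> float:
    """frames: (T, H, W, 3) uint8 crops of the face region, in RGB order."""
    # Subtle blood-volume changes show up most strongly in the green channel.
    green_trace = frames[..., 1].reshape(frames.shape[0], -1).mean(axis=1)

    # Band-pass to a plausible heart-rate range (~42-240 beats per minute).
    low, high = 0.7, 4.0
    b, a = butter(3, [low / (fps / 2), high / (fps / 2)], btype="band")
    pulse = filtfilt(b, a, green_trace - green_trace.mean())

    # The dominant frequency of the filtered trace gives the pulse estimate.
    spectrum = np.abs(np.fft.rfft(pulse))
    freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
    mask = (freqs >= low) & (freqs <= high)
    return float(freqs[mask][np.argmax(spectrum[mask])] * 60.0)
```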
The researchers trained the system on a dataset of facial videos paired with pulse and respiration measurements from standard field instruments; the system then computed vital signs from spatial and temporal information in the videos.
Said UW’s Xin Liu, “Every person is different, so this system needs to be able to quickly adapt to each person’s unique physiological signature and separate this from other variations, such as what they look like and what environment they are in.”
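One way to read "quickly adapt to each person" is a brief per-user calibration: fine-tune a pretrained estimator for a few gradient steps on a short clip of the new user. The sketch below is a hedged illustration of that idea only; the model, pretrained weights, and calibration signal are assumptions, and the actual system's adaptation procedure may differ.

```python
import copy
import torch
import torch.nn.functional as F

def adapt_to_person(model: torch.nn.Module,
                    calib_clip: torch.Tensor,   # (1, T, 3, H, W) video of the new user (assumed shape)
                    calib_pulse: torch.Tensor,  # (1, T) reference pulse for that clip
                    steps: int = 5,
                    lr: float = 1e-4) -> torch.nn.Module:
    """Return a copy of `model` tuned to one person's appearance and environment."""
    personalized = copy.deepcopy(model)
    opt = torch.optim.SGD(personalized.parameters(), lr=lr)
    personalized.train()
    for _ in range(steps):
        opt.zero_grad()
        pred = personalized(calib_clip)         # predicted pulse waveform, (1, T)
        loss = F.mse_loss(pred, calib_pulse)    # match the short calibration recording
        loss.backward()
        opt.step()
    return personalized
```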
From UW News
Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA