Summary: Researchers have developed an AI tool called FaceAge, which uses facial images to estimate biological age and predict survival in cancer patients. A study of more than 6,000 patients found that cancer patients’ FaceAge was, on average, about five years older than their chronological age, and that a higher FaceAge was associated with shorter survival.
The tool outperformed clinicians at predicting short-term life expectancy, and further improved clinicians’ predictions when integrated into their decision-making. These results suggest that facial features can be powerful, non-invasive biomarkers of aging and disease, opening up new possibilities for precision medicine.
Important facts:
- FaceAge AI: estimates biological age and predicts survival from facial images.
- Cancer and appearance: cancer patients looked about five years older than their chronological age.
- Clinical reinforcement: FaceAge improved doctors’ predictions of life expectancy in palliative care.
Source: Mass General
The eyes may be the windows to the soul, but a person’s biological age can also be revealed by their facial features.
Researchers at Brigham and Women’s Hospital in Massachusetts have developed a deep learning algorithm called FaceAge. The algorithm uses a photo of a person’s face to estimate biological age and predict survival in cancer patients.
They found that cancer patients, on average, had a higher FaceAge than those without cancer, looking about five years older than their chronological age.
Older FaceAge predictions were associated with poorer overall survival outcomes in several types of cancer.
They also found that FaceAge outperformed doctors in predicting the short-term survival of patients who received palliative radiotherapy.
Their results have been published in The Lancet Digital Health.
“Artificial intelligence (AI) can be used to estimate a person’s biological age by analyzing facial images, and our study demonstrates that this data can have important clinical applications,” said Hugo Aerts, PhD, co-senior and corresponding author, and director of the Artificial Intelligence in Medicine (AIM) Program at Mass General Brigham.
“This work shows that a photograph, even a simple selfie, contains important information that can help guide clinical decisions and care plans for patients and doctors.
“How old a person looks compared to their chronological age matters: people whose facial age is younger than their chronological age do significantly better after cancer treatment.”
When patients come in for a consultation, their appearance can give doctors insight into their overall health and vitality. This intuitive assessment, combined with the patient’s chronological age and many other biological parameters, can help determine the best treatment.
However, doctors, like everyone else, can have biases when judging a patient’s age, and these biases can skew their assessments. This creates a need for more objective and predictive measurements to inform care decisions.
To this end, researchers at Mass General Brigham used deep learning and facial recognition technologies to train FaceAge. The tool was trained on 58,851 images of presumed-healthy individuals from public datasets.
The team tested the algorithm in a group of 6,196 cancer patients from two centers, using images taken routinely at the start of radiotherapy treatment.
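As a rough illustration of this train-then-test setup (not the authors’ actual architecture, which is a deep convolutional network), the sketch below fits a simple least-squares age-regression head on synthetic feature vectors that stand in for the embeddings a face network would produce, then reports held-out error. Every name and number here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each "face" is a feature vector (a stand-in for the
# embedding a convolutional network would extract from a photograph), and
# age is modelled as a linear function of those features plus noise.
n_train, n_test, n_features = 500, 100, 16
true_w = rng.normal(size=n_features)

X_train = rng.normal(size=(n_train, n_features))
age_train = X_train @ true_w + 60 + rng.normal(scale=2.0, size=n_train)

X_test = rng.normal(size=(n_test, n_features))
age_test = X_test @ true_w + 60 + rng.normal(scale=2.0, size=n_test)

# Fit the age-regression head by least squares (with an intercept column).
A = np.column_stack([X_train, np.ones(n_train)])
coef, *_ = np.linalg.lstsq(A, age_train, rcond=None)

# Evaluate on held-out "faces": mean absolute error in years.
pred = np.column_stack([X_test, np.ones(n_test)]) @ coef
mae = np.abs(pred - age_test).mean()
print(f"held-out MAE: {mae:.2f} years")
```

The same discipline applies at any scale: fit on one population, then report error only on images the model never saw, as the study does with its separate clinical cohorts.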
Across the cancer cohorts, an older FaceAge was associated with poorer survival outcomes, especially in individuals with a FaceAge over 85, even after adjusting for age at diagnosis, sex, and cancer type.
Predicting survival at the end of life is difficult, but it has important therapeutic implications in cancer care. The team asked 10 physicians and researchers to predict short-term life expectancy based on 100 images of patients receiving palliative radiotherapy.
Although performance varied widely, doctors’ predictions were generally only a little better than a coin toss, even when they were given medical context such as the patient’s chronological age and cancer status.
However, when doctors also received the patient’s FaceAge information, their predictions improved significantly.
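Improvements like this are typically quantified with the area under the ROC curve, the metric the paper’s abstract reports. A minimal sketch, using made-up risk scores rather than any real study data, computes AUC via the Mann-Whitney formulation: the probability that a randomly chosen patient who died within the window is ranked as higher-risk than one who survived it.

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """Mann-Whitney AUC: P(random positive outranks random negative),
    counting ties as one half."""
    scores_pos = np.asarray(scores_pos, float)
    scores_neg = np.asarray(scores_neg, float)
    wins = (scores_pos[:, None] > scores_neg[None, :]).sum()
    ties = (scores_pos[:, None] == scores_neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores: higher = predicted higher short-term risk.
died     = [0.9, 0.8, 0.7, 0.4]   # patients who died within the window
survived = [0.6, 0.5, 0.3, 0.2]   # patients who survived it
print(auc(died, survived))        # -> 0.875
```

An AUC of 0.5 is the coin toss the doctors hovered near; 1.0 is a perfect ranking.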

More research is needed before this technology can be considered for medical use. The research team is also testing the technology’s ability to predict diseases, general health status, and lifespan.
Follow-up studies include expanding this work to different hospitals, observing patients at different stages of cancer, tracking FaceAge estimates over time, and testing its robustness on datasets that include plastic surgery and makeup.
“This opens the door to a new area of research to discover biomarkers from images, and its potential goes far beyond cancer treatment or age prediction,” said Ray Mak, MD, co-senior author and faculty member in the AIM program at Mass General Brigham.
“As we increasingly think of chronic diseases as diseases of aging, being able to accurately predict a person’s aging trajectory becomes ever more important. I hope we can eventually use this technology as an early detection system in a variety of applications, within a strong legal and ethical framework, to help save lives.”
Authorship: Other authors at Mass General Brigham include Dennis Bontempi, Osbert Zalay, Danielle S. Bitterman, Fridolin Haugg, Jack M. Qian, Hannah Roberts, Subha Perni, Vasco Prudente, Suraj Pai, Christi Guthier, Tracy Balboni, Laura Warren, Monica Krishna, and Benjamin H. Kann.
Disclosures: Mass General Brigham has filed provisional patents for two next-generation vision health algorithms.
Funding: This project received funding from the National Institutes of Health (HA: NIH-USA U24CA194354, NIH-USA U01CA190234, NIH-USA U01CA209414, and NIH-USA R35CA22052) and the European Union’s European Research Council (HA: 866504).
About this AI, aging, and cancer research news
Author: Ryan Jaslow
Source: Mass General
Contact: Ryan Jaslow – Mass General
Image: The image is credited to StackZone Neuro
Original Research: Open access.
“FaceAge, a deep learning system to estimate biological age from face photographs to improve prognostication: a model development and validation study” by Hugo Aerts et al. Lancet Digital Health
Abstract
FaceAge, a deep learning system for estimating biological age from facial images to improve prognostication: a model development and validation study
Background
Because people age at different rates, appearance can provide more reliable information about biological age and physical health than chronological age. However, in medicine, appearance is incorporated into medical assessments in a subjective and non-standardized manner.
In this study, our goal was to develop and validate FaceAge, a deep learning system for estimating biological age based on inexpensive and readily available facial images.
Methods
FaceAge was trained using data from 58,851 presumed-healthy individuals aged 60 years and older: 56,304 individuals from the IMDb-Wiki dataset (training) and 2,547 from the UTKFace dataset (initial validation).
Clinical efficacy was assessed using data from 6,196 cancer patients from two institutions in the Netherlands and the United States: the MAASTRO, Harvard Thoracic, and Harvard Palliative cohorts. FaceAge estimates in these cancer cohorts were compared with 535 non-cancer controls.
To assess the prognostic significance of FaceAge, we performed Kaplan-Meier survival analyses and fitted Cox models, adjusting for several clinical covariates. We also evaluated the performance of FaceAge in patients with metastatic cancer who received palliative care at the end of life, and incorporated it into clinical prediction models.
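To illustrate the Kaplan-Meier analysis mentioned above, here is a minimal product-limit estimator run on a toy cohort; the data are invented and the function is a sketch of the standard method, not the study’s actual pipeline.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : follow-up time for each patient
    events : 1 if the event (death) was observed, 0 if censored
    Returns (event times, survival probability just after each).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    surv = 1.0
    out_t, out_s = [], []
    n_at_risk = len(times)
    for t in np.unique(times):
        at_this_t = times == t
        d = events[at_this_t].sum()          # deaths at time t
        if d > 0:
            surv *= 1.0 - d / n_at_risk      # product-limit step
            out_t.append(t)
            out_s.append(surv)
        n_at_risk -= at_this_t.sum()         # deaths and censored leave risk set
    return np.array(out_t), np.array(out_s)

# Toy cohort of 6 patients, two of them censored (event = 0).
t, s = kaplan_meier([2, 3, 3, 5, 8, 8], [1, 1, 0, 1, 1, 0])
print(dict(zip(t, s.round(3))))  # -> {2.0: 0.833, 3.0: 0.667, 5.0: 0.444, 8.0: 0.222}
```

In the study this curve is computed per FaceAge stratum, and the Cox model then tests whether the strata separate after adjusting for the clinical covariates.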
To assess whether FaceAge has the potential to be a biomarker of molecular aging, we performed a gene-based analysis to evaluate its association with senescence genes.
Findings
FaceAge showed significant independent prognostic performance across different cancer types and stages.
Looking older was associated with worse overall survival (after adjusting for covariates, per decade of FaceAge: hazard ratio [HR] 1.151, p = 0.013, in a pan-cancer cohort of n = 4,906; HR 1.148, p = 0.011, in a thoracic cohort; HR 1.117, p = 0.021, in a palliative cohort of n = 717).
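Per-decade hazard ratios compound multiplicatively: under the first cohort’s estimate (HR 1.151 per decade), a patient whose FaceAge is two decades higher carries roughly 1.151² ≈ 1.32 times the hazard, other covariates held fixed. A quick check:

```python
# Per-decade hazard ratio from the abstract (HR 1.151 per decade of FaceAge).
hr_per_decade = 1.151

def relative_hazard(extra_years, hr=hr_per_decade):
    """Hazard multiplier implied by `extra_years` of additional FaceAge."""
    return hr ** (extra_years / 10.0)

print(round(relative_hazard(10), 3))   # one decade  -> 1.151
print(round(relative_hazard(20), 3))   # two decades -> 1.325
```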
We found that cancer patients on average looked older than their chronological age (a mean increase of 4.79 years relative to the non-cancer reference group, p < 0.0001).
We found that FaceAge can improve clinicians’ survival predictions for patients with incurable cancer receiving palliative care (area under the curve improved from 0.74 [95% CI 0.70-0.78] to 0.80 [0.76-0.83]; p < 0.0001), illustrating the clinical utility of the algorithm in supporting end-of-life decision-making.
FaceAge was also linked to the molecular mechanisms of aging through genetic analysis, whereas chronological age was not.
Interpretation
Our results suggest that a deep learning model can estimate biological age from facial images and thereby improve survival prognostication for cancer patients.
Further research, including validation in larger cohorts, is needed to confirm these findings in cancer patients and to determine whether they extend to patients with other diseases.
After further testing and validation, approaches like FaceAge could be used to translate a patient’s visual appearance into an objective, quantitative, and clinically valuable measurement.
Funding
US National Institutes of Health and the European Union’s European Research Council.

