The NHSX AI Lab has spoken to technology businesses and care providers across the country and found that social care is lagging behind health in developing and adopting AI. The provider landscape is more fragmented, and many organisations still collect data manually. The gap may also stem from a shortage of technical professionals in the sector.

Many of the issues raised by these case studies are being addressed by NHSX, for example by funding and promoting simpler data sharing between health and care organisations. Below are three UK case studies that demonstrate AI and data-driven healthcare. You can find more information about further case studies and funding on the NHSX AI Lab website.

Case study 1: Birdie, intelligent monitoring for preventative care

Solving the challenge

A care worker spends only a limited amount of time at a care recipient's home, so they see them only briefly. The service that trialled this project averages three face-to-face visits per day, each lasting about 30 minutes, for a total of 90 minutes per day. This makes it difficult to gain a good overall picture of people's health and to support early identification and prevention.

Integrating variables such as bowel movements, food consumption, sleep duration and general mobility would provide a more complete picture and enable trend analysis to detect behavioural changes. An automatic alert system could then prompt immediate action, and the same trends may reveal changes in a person's health that can be addressed early, avoiding hospitalisation.

The Project

Birdie has piloted this idea with Medacs Healthcare, using motion sensors in care recipients' homes in Cheshire and Bristol. The data from these sensors is evaluated using rules-based trend analysis with defined thresholds, and care managers are alerted when abnormal behaviour is observed. Care managers act on these notifications and log their actions. They currently also report the real-life status of the care recipient, to build a more thorough training dataset and fine-tune the alert sensitivity, and they tell Birdie what they did in response to an alert, such as speaking to the care recipient, the family or a professional providing medical advice.

This determines whether the abnormal sensor readings were indeed concerning or a false alert.
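The rules-based trend analysis described above can be sketched as a simple baseline-deviation check. The metric names and the z-score threshold below are illustrative assumptions, not Birdie's actual rules; the idea is only to show how per-person baselines and defined thresholds can drive alerts.

```python
from dataclasses import dataclass
from statistics import mean, stdev

# Hypothetical daily metrics derived from home motion sensors.
@dataclass
class DailyMetrics:
    night_movements: int    # proxy for sleep disturbance
    kitchen_visits: int     # proxy for food and drink intake
    bathroom_visits: int    # proxy for bowel/bladder activity

def check_for_alerts(history: list[DailyMetrics], today: DailyMetrics,
                     z_threshold: float = 2.0) -> list[str]:
    """Flag metrics that deviate sharply from this person's own baseline."""
    alerts = []
    for field in ("night_movements", "kitchen_visits", "bathroom_visits"):
        baseline = [getattr(day, field) for day in history]
        mu, sigma = mean(baseline), stdev(baseline)
        value = getattr(today, field)
        # A defined threshold in standard deviations triggers the alert.
        if sigma > 0 and abs(value - mu) / sigma > z_threshold:
            alerts.append(f"{field}: {value} (baseline {mu:.1f} +/- {sigma:.1f})")
    return alerts
```

A care manager reviewing the returned alerts would then log whether the deviation was genuinely concerning or a false alert, which is exactly the feedback used to tune the threshold over time.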

During a six-month pilot with 20 care recipients, the alerts detected:

  • One case of undiagnosed dementia.
  • Two cases of urinary tract infections (UTIs) requiring early management.
  • One case requiring changes to a care plan due to dehydration.
  • One serious illness necessitating hospital admission.

Preventing health deterioration has improved short-term care quality and reduced long-term care needs and costs, while ensuring continuous monitoring in a way that is safe and non-intrusive for both care recipients and professionals. The pilot will now continue with 40 additional care recipients.

Preliminary lessons learned 

After the pilot project, it became evident that a wider circle of interaction and communication is required, encompassing the people who receive care, care professionals and families.

Medacs Healthcare is still experimenting with new staffing models. The challenge is striking a balance between monitoring through the product and maintaining geographical proximity to the service user.

Visitors and carers triggering the motion sensors have caused some data anomalies. Data interpretation has therefore been synchronised with care schedules and family visits for greater transparency, and periods covering caregiver and family visits are excluded from the dataset when assessing and interpreting care recipient behaviour.

While anecdotal evidence and user feedback suggest that intelligent monitoring leads to better outcomes and more proactive care, quantified evidence is harder to come by. Making the case for early intervention in an individual case is difficult: for a UTI caught early, for instance, one can never know how severe it would otherwise have become. More research on robust evaluation methods is needed.

Case study 2: Painchek, identifying and assessing pain in people with dementia who are unable to self-report

Solving the challenge

It is extremely difficult to determine the level of discomfort experienced by people who are unable to communicate, such as those with mild to advanced dementia. People's suffering can go unnoticed and untreated if they are unable to express it verbally or in writing. This is distressing for the individual, and it can lead to behavioural and psychological issues that are difficult for caregivers to manage. Existing paper-based methods for measuring pain often go unused in nursing homes because they are labour-intensive and subjective.

It is estimated that by 2025, more than one million Britons will be living with dementia, and that 70% of people in care facilities have dementia or severe memory impairments. The priority now is to increase support for people living with dementia and to raise health and care providers' awareness of the condition. PainChek was developed as a result of clinical studies conducted at Curtin University in Australia.

Pain assessment instrument

PainChek is a CE-certified, artificial intelligence-powered pain assessment tool offered as a point-of-care app for personal mobile devices. It takes a 3-second video of a person's face and uses AI to detect pain-related facial micro-expressions. This information is then automatically combined with additional non-facial signs of pain – entered on a digital checklist by a caregiver – to generate an overall pain score: no pain, mild, moderate or severe.

With knowledge of a resident's pain score, caregivers can decide on suitable pain management strategies and assess their effect over time. Scores can be transmitted automatically to a care management system, and an integrated web interface enables a care home to:

  • Analyse the overall pain burden in the home.
  • Evidence the outcomes of pain interventions.
  • Share results with general practitioners (GPs) and families.

Automated facial recognition technology

This technique is based on the Facial Action Coding System (FACS) and is powered by a machine learning algorithm trained on labelled data. FACS is a facial expression taxonomy comprising 52 Action Unit (AU) codes. Certain pain-related AU codes are reported more frequently in people with dementia than in healthy controls, possibly due to diminished operant learning (learning through reward and punishment for behaviour) and a reduced ability to disguise negative expressions such as pain.

The PainChek tool considers the following AUs that are associated with pain:

  • Brow lowering.
  • Cheek raising.
  • Tightening of eyelids.
  • Wrinkling of nose.
  • Raising of upper lip.
  • Pulling at corner lip.
  • Horizontal mouth stretches.
  • Parting of lips.
  • Closing of eyes.

Non-facial indicators of pain

On the PainChek app, a simple digital checklist prompts caregivers to give binary yes/no responses on the presence of indicators under five additional non-facial domains established by the American Geriatrics Society for assessing pain severity: voice, movement, behaviour, activity and body. These observations are paired with the analysis of a resident's facial expressions to digitally calculate the total pain score.
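The combination of facial and non-facial observations described above can be illustrated with a short sketch. The AU names and domain names follow the text, but the weighting (one point per observation) and the category cut-offs are invented for illustration and are not PainChek's actual scoring model.

```python
# Pain-related facial Action Units listed in the text (names are
# illustrative identifiers, not official FACS codes).
PAIN_RELATED_AUS = {
    "brow_lowering", "cheek_raising", "eyelid_tightening", "nose_wrinkling",
    "upper_lip_raising", "lip_corner_pulling", "mouth_stretch",
    "lips_parting", "eye_closing",
}

# The five non-facial domains from the caregiver checklist.
NON_FACIAL_DOMAINS = ("voice", "movement", "behaviour", "activity", "body")

def total_pain_score(detected_aus: set[str],
                     checklist: dict[str, bool]) -> tuple[int, str]:
    """Sum detected facial AUs and positive checklist items, then map
    the total onto a pain category (thresholds are hypothetical)."""
    facial = len(detected_aus & PAIN_RELATED_AUS)
    non_facial = sum(1 for d in NON_FACIAL_DOMAINS if checklist.get(d, False))
    score = facial + non_facial
    if score == 0:
        category = "no pain"
    elif score <= 4:
        category = "mild"
    elif score <= 8:
        category = "moderate"
    else:
        category = "severe"
    return score, category
```

For example, two detected facial AUs plus a single positive checklist item would yield a score of 3 and, under these assumed cut-offs, a "mild" rating.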

Preliminary lessons learned 

Care professionals are more likely to use PainChek consistently when it is integrated into routine care tasks, such as medication rounds. This improves pain documentation and staff engagement in evidence-based pain management practices.

Covid-19 presented implementation issues in care facilities, but these were overcome. The induction process has been modified to allow remote delivery. Additionally, caregivers can conduct the AI-powered facial expression assessment from up to three metres away from a resident, reducing close contact.

Case study 3: Feebris, using remote monitoring to detect respiratory diseases in their earliest stages

Solving the challenge

It can be challenging to ensure that older residents in care homes receive the healthcare they require on time. In the worst case, this can result in costly hospitalisation – typically associated with poorer health and care outcomes – that could have been prevented through earlier action. Telemedicine is increasingly viewed as part of the solution, particularly for improving access to primary care consultations. However, the comorbidities frequently seen in older adults make it difficult for GPs to reach diagnostic conclusions, about respiratory illnesses in particular, using video interaction alone.

Feebris has partnered with Care City – an innovation hub for healthy ageing and regeneration – and Havering Health – a partnership of local GP practices – to roll out its remote monitoring technology across East London's care facilities and GP practices. Between February and July 2020, about 1,000 residents were supported by this technology.

The AI-powered solution enables caregivers to collect observations and vital signs directly from residents through a smartphone application. These observations cover 22 clinical parameters.

A Feebris 'kit' consists of a smartphone, a digital stethoscope, a pulse oximeter and a blood pressure cuff. Caregivers can proactively triage emerging health risks and communicate concerns to GPs. Additionally, the product features a web dashboard that GPs can use to remotely review vital signs and listen to lung sounds, detecting and addressing health risks early.

AI powers the technology at several points. Real-time signal processing helps capture high-quality data with a strong, noise-free signal. Further algorithms then extract key measurements from the signals, such as the respiration rate. Feebris is currently developing additional algorithms to personalise detection thresholds and check-up schedules, as well as decision-support tools for diagnosing infections and flare-ups of chronic illnesses.
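One of the signal-processing steps mentioned above, extracting a respiration rate, can be sketched as peak counting on a breathing waveform. This is a minimal illustration on a synthetic signal; a real device would use far more robust filtering and artefact rejection.

```python
import math

def respiration_rate_bpm(signal: list[float], sample_rate_hz: float) -> float:
    """Estimate breaths per minute by counting local maxima above the
    signal mean (a crude stand-in for proper peak detection)."""
    mu = sum(signal) / len(signal)
    peaks = 0
    for i in range(1, len(signal) - 1):
        if signal[i] > mu and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            peaks += 1
    duration_min = len(signal) / sample_rate_hz / 60.0
    return peaks / duration_min

# Synthetic 60-second breathing trace: 0.25 Hz (15 breaths/min), sampled at 10 Hz.
fs = 10.0
trace = [math.sin(2 * math.pi * 0.25 * (n / fs)) for n in range(int(60 * fs))]
```

Running the function on this clean synthetic trace recovers the 15 breaths per minute it was generated with; noisy real-world signals are precisely why the text emphasises the upstream noise-removal stage.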

The frequency of check-ups for residents varies according to their overall health risk profile, but is typically weekly or monthly. A check-up is non-invasive and takes approximately 10-15 minutes in the resident's home.

Residents and their families have expressed relief that they and their caregivers now have extensive healthcare information at their fingertips and can easily share it with their GP. Social care staff have been empowered to take a more active role in comprehensive care pathways.

Preliminary lessons learned 

There have been difficulties embedding the product into shared workflows between care homes and GPs. Feebris has been working iteratively with all users to understand current procedures and explore new ones, to make the transition seamless.


We have merely scratched the surface of what AI can do in social care. Countries and healthcare organisations worldwide are making strides in leveraging health information technology to improve outcomes and access, laying the groundwork for the future of health.

Our latest report, Artificial Intelligence and Ageing - Machine learning for human health and longevity, seeks to answer fundamental questions about AI and ageing, describes tangible next steps for adoption, and makes policy recommendations to the UK Government. It also compiles a collection of carefully selected short case studies from around the world to:

  • Identify successful applications of AI and ageing in advancing population health goals, with a particular emphasis on accomplishments that would not have been achievable with older technologies.
  • Examine different approaches to resolving common problems and draw applicable lessons.

Download our free report to learn more: Artificial intelligence and ageing

To share your thoughts or questions, log in to your IET EngX account, and leave your comments below.