News
MySense – helping people stay safe for longer at home

Seven years ago, the renowned theoretical physicist and cosmologist Professor Stephen Hawking warned that the creation of artificial intelligence would be “either the best, or the worst thing, ever to happen to humanity.”
He was speaking at the opening of the Leverhulme Centre for the Future of Intelligence, a multi-disciplinary institute within Cambridge University dedicated to studying artificial intelligence – now universally known simply as AI.
Despite fears in some circles around the rise of AI, its development has moved on apace. From medicine to manufacturing, education, science, transportation, customer service, agriculture, entertainment, retail, surveillance, finance, and the military, the technology has quickly become a part of everyday life.
Not surprisingly, it’s also a driving force behind pioneering new ideas and inventions, one of which is helping oversee people living with dementia and other neurological conditions.
MySense is a high-level monitoring system developed in the UK that uses AI algorithms to keep a check on vulnerable patients, identify preventative solutions, and prioritise the human touch when it’s needed, cutting down on hospital visits and allowing people to live at home for longer.
The data collected builds a digital portrait of the patient and in some cases can even recognise the warning signals of a potential illness, alerting caregivers weeks in advance.
Clinical evaluation at two NHS trusts has shown MySense is already having a positive impact in cutting unplanned hospital admissions, freeing up beds, and saving money.
South Warwickshire NHS Foundation Trust, for instance, recorded a 46% reduction in unplanned hospital admissions for patients in its area using MySense.

MySense founder and group CPO, Lucie Glenday
Meanwhile, University Hospitals North Midlands NHS Trust, which has used MySense to work with a cohort of high-intensity and frail elderly patients, recorded consistent monthly reductions of between 40% and 50% over an eight-month period compared to admissions for the same users before the product was used in their care.
This equated to an average cost reduction per patient of £16,458 per year.
These are impressive figures. But for many, there may be a thorny issue: whilst MySense offers the chance to get ahead of a range of health problems, and gives valuable reassurance and peace of mind to users, their families, and caregivers, at its heart it relies on AI to build up a picture of the patient.
Eight sensors are installed in specific locations around the home. One is a wearable heart monitor, whilst another is a sleep belt that goes on the bed. The others detect movement and activity at key points around the house, such as the bathroom, kitchen, and front door.
This sensor technology picks up around 20,000 data points per day per person to create a personalised insight.
Multiple layers of algorithms and AI process that information to build up a picture of an individual’s activities and health. Environmental and contextual attributes are then overlaid with disease models to predict health decline and deterioration.
What MySense essentially does is build a picture of what is ‘normal’ for each user to live independently in their own home. Variations in the data from the sensors could then be a warning signal that not all is well.
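The idea described above, learning what is ‘normal’ for each user and treating deviations as a warning signal, can be illustrated with a minimal anomaly-detection sketch. The metric, threshold, and readings below are hypothetical assumptions for illustration only, not MySense’s actual models or data.

```python
from statistics import mean, stdev

def build_baseline(history):
    """Summarise a user's 'normal' for one metric (here, hypothetical
    nightly kitchen visits) from past daily readings."""
    return mean(history), stdev(history)

def is_anomalous(reading, baseline, threshold=2.5):
    """Flag a reading that deviates from the personal baseline by more
    than `threshold` standard deviations."""
    mu, sigma = baseline
    if sigma == 0:
        return reading != mu
    return abs(reading - mu) / sigma > threshold

# Two illustrative weeks of nightly kitchen visits for one user
history = [4, 5, 4, 6, 5, 4, 5, 6, 4, 5, 5, 4, 6, 5]
baseline = build_baseline(history)
print(is_anomalous(5, baseline))  # typical night → False
print(is_anomalous(1, baseline))  # sharp drop, worth an alert → True
```

A real system would track hundreds of such metrics per person and combine them before alerting, but the per-metric principle is the same: the baseline is personal, so the same reading can be normal for one user and a warning sign for another.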
Self-confessed data geek and MySense founder and group CPO, Lucie Glenday, whose early career included working on the Government Digital Services team which created gov.uk, is the first to admit that she shares some of the current worries being aired about the possible risks our overreliance on AI could pose.
She told Agetech World: “I have very strong feelings myself around the way that AI models are trained, the ethical nature of that training, and the transparency of the decision-making. I think all of those things are really big questions that need to be answered.”
But Ms Glenday, who set up MySense in 2016 to help people with neurological disorders like dementia, Parkinson’s, and motor neurone disease live better lives, added: “We’re a B-Corp (a certification for companies that meet high standards of social and environmental performance, transparency, and accountability), so we take our ethical responsibilities really, really seriously. This is not something that is in response to all this damaging stuff in the press.
“This is something that we have built into the business that is at the very core. It is really important.”
She explained that because of the complexity of the problem that MySense needs to solve, it runs multiple machine-learning algorithms.
“The reason we use the sensors is because we get those consistent data points coming in day in and day out to be able to build the models off.
“We have to have good data coming in, and ethically acquired data.
“So, we have all of this data, how do we process, understand and analyse it in a way that doesn’t mean that we have this black box and a machine making decisions that it’s not qualified to make?
“We put some guard rails in place. We say that all machine learning has to be supervised for a production environment, so every single decision that is made has to have supervision in it so that we can absolutely go back and audit every decision an AI machine has made.
“We do use some deep learning methodology, but it’s only in research. It is literally looking at some of the medical data out there and understanding those patterns.”
Transparency and auditability are critical, she said, making sure the training is undertaken on ethically acquired data, and that there is proper oversight “to make sure we are not running roughshod over this stuff.”
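The guard rails Ms Glenday describes, where every production decision is recorded so it can be audited later, might be sketched as a thin wrapper around the model call. The wrapper, rule, and field names here are illustrative assumptions, not MySense’s implementation.

```python
import time

def audited_predict(model, features, model_version, log):
    """Run a model and record the inputs, version, and decision so every
    call can be traced back during an audit."""
    decision = model(features)
    log.append({
        "timestamp": time.time(),
        "model_version": model_version,
        "inputs": features,
        "decision": decision,
    })
    return decision

# Hypothetical rule standing in for a supervised model
def low_mobility_rule(features):
    return "alert" if features["daily_steps"] < 500 else "ok"

audit_log = []
decision = audited_predict(low_mobility_rule, {"daily_steps": 300}, "v1.2", audit_log)
print(decision)        # → alert
print(len(audit_log))  # → 1
```

The point of the pattern is that the log, not the model, becomes the source of truth for "why did the system alert on this person at this time", which is what makes human review and retrospective auditing possible.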
MySense is a very personal venture for Ms Glenday. Her sister, Olivia, was just 23 when she died from a rare form of MND in December 2006.
Hence her interest in helping people with neurological disorders to live better, independent lives in their own homes.
Ms Glenday believes MySense is unique.
She said: “There is an awful lot out there in the market that is part of the digital switchover that uses sensors around the home and talks about AI. But we have never put ourselves in that box. We have not played in the telecare space. That is not the environment that we have built the product for.
“It is probably going to sound a bit arrogant but yes, I believe we are unique because we have looked at this purely as a data problem.
“I do not believe there is any other organisation out there right now with 352 metrics on an individual on a daily basis that can track and manage someone’s independence and health to the level that we can.
“It may well be that there is a whole cohort of people out there that say ‘Oh well, I don’t need all of that information. I just need to know that somebody is alive and that they are not on the floor.’ And that is absolutely fine. I have to say there are lots of great products out there to solve that problem.
“But I genuinely believe with overstretched health provision across the world, not just here in the UK, more and more focus is going to be about supporting people to live with long-term and chronic conditions at home, and we are going to need to overcome that challenge as a society over the next few years. I believe we are positioned perfectly to take that opportunity on.”
The vast amounts of information collected by MySense allow for better early diagnosis of a range of problems. Ms Glenday said: “A great example of this is UTIs (urinary tract infections). For the most part, people are picking them up one to two weeks in advance of a UTI requiring somebody to be blue-lighted into hospital.
“This is because we are seeing the patterns. We are seeing the behaviours, we are seeing the reduction in mobility, we are seeing the poor sleep patterns, we are seeing the escalation in heart rate, we are seeing a reduction in the number of times going to the kitchen to stay hydrated, we are seeing a decrease or increase in respiratory rate overnight.
“All of these factors show the early indications really early on so that people can intervene and say, ‘OK, we think we’ve got a problem.’ I think that is why we have achieved the results we have.
“It means that people can get ahead and hopefully either put the additional support around or treat at home so there is no requirement to go into hospital.”
MySense is not a medical device, although Ms Glenday has high hopes it will gain such accreditation in the next year. “All we do at this point in time is deliver what we see. We are not triaging, we are not saying we have picked up an early diagnosis of dementia. What we are saying is, look at all these different factors that would help a clinical team on the ground get to that decision much quicker than they would without that line of sight.
“That is the really important part of what we do; telling that story so that it is easier for the clinical teams to make a clinical decision about future care for this individual.”
Ms Glenday’s hope is that in the near future, enough behavioural information will have been gathered for MySense to begin to fundamentally understand what is going to happen and build care better tailored to each patient.
“We are really excited for the next phase as we start to build our predictive models around specific diseases.
“So, we are going to be looking at Parkinson’s and diabetes and of course, a cause close to my heart, MND, and start building our predictive models against those specific conditions.
“Hopefully, that will mean that we get to work with those cohorts.
“I would love to be able to go and sit down with someone who has recently been diagnosed with Parkinson’s and say ‘We are going to help you manage this.’
“That would just be an extraordinary thing for us as a business to be able to achieve. That is what is making me really excited.”
AI can predict Alzheimer’s with almost 93% accuracy, researchers say

An AI system can predict Alzheimer’s disease with nearly 93 per cent accuracy, researchers say, after being trained on more than 800 brain scans.
The system identified anatomical changes in the brain linked to the onset of the most common form of dementia, a condition that gradually damages memory and thinking.
The findings build on years of research suggesting AI could help spot early Alzheimer’s risk, predict disease and identify patients whose condition has not yet been diagnosed.
Benjamin Nephew, an assistant research professor at the Worcester Polytechnic Institute in Massachusetts, said: “Early diagnosis of Alzheimer’s disease can be difficult because symptoms can be mistaken for normal ageing.
“We found that machine-learning technologies, however, can analyse large amounts of data from scans to identify subtle changes and accurately predict Alzheimer’s disease and related cognitive states.”
The study used MRI scans, a type of detailed brain imaging, from 344 people aged 69 to 84.
The dataset included 281 scans showing normal mental function, 332 with mild cognitive impairment, an early stage of memory and thinking decline, and 202 with Alzheimer’s.
The scans covered 95 of the brain’s nearly 200 distinct regions, and an AI algorithm used measurements from those regions to predict patients’ cognitive status.
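As a rough illustration of the idea, predicting a diagnostic label from per-region volume measurements, here is a minimal nearest-centroid classifier. The region list, numbers, and method are hypothetical stand-ins; the study’s actual algorithm and data are not reproduced here.

```python
from statistics import mean

def centroid(rows):
    """Average each feature (a brain-region volume) across a group."""
    return [mean(col) for col in zip(*rows)]

def classify(volumes, centroids):
    """Assign the label whose group centroid is closest (squared
    Euclidean distance) to the patient's volume measurements."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(volumes, centroids[label]))

# Hypothetical normalised volumes: [hippocampus, amygdala, entorhinal cortex]
train = {
    "healthy":    [[1.00, 0.98, 1.02], [0.99, 1.01, 0.97]],
    "alzheimers": [[0.78, 0.80, 0.75], [0.81, 0.77, 0.79]],
}
centroids = {label: centroid(rows) for label, rows in train.items()}
print(classify([0.80, 0.79, 0.77], centroids))  # → alzheimers
```

Real models of this kind use many more regions and patients and far more sophisticated learning methods, but the core input is the same: a vector of regional volumes per scan, with shrinkage in memory-related regions pulling a patient toward the disease class.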
Being able to use AI to help diagnose Alzheimer’s earlier could give patients and doctors crucial time to prepare and potentially slow the progression of the disease.
The analysis showed that one of the top predictive factors was brain volume loss, or shrinkage, in the hippocampus, which helps form memories, the amygdala, which processes fear, and the entorhinal cortex, which helps provide a sense of time.
This pattern held across age and sex, with both men and women aged 69 to 76 showing volume loss in the right part of the hippocampus, suggesting it may be an important area for early diagnosis, the researchers noted.
However, the research also found that the way brain regions shrink differs by sex.
In females, volume loss occurred in the brain’s left middle temporal cortex, which is involved in language and visual perception. In males, it was mainly seen in the right entorhinal cortex.
The researchers believe this could be linked to changes in sex hormones, including the loss of oestrogen in women and testosterone in men.
These conclusions could help improve methods of diagnosis and treatment going forward, Nephew said.
More than 7.2m Americans are living with Alzheimer’s, according to the Alzheimer’s Association.
More research is being done to reveal other contributing factors.
Nephew said: “The critical challenge in this research is to build a generalisable machine-learning model that captures the difference between healthy brains and brains from people with mild cognitive impairment or Alzheimer’s disease.”