Scientists develop ‘smart pyjamas’ to monitor sleep disorders

Researchers have developed innovative, washable smart pyjamas designed to monitor sleep disorders such as sleep apnoea at home. These pyjamas eliminate the need for sticky patches, bulky equipment, or visits to specialist sleep clinics.

Led by the University of Cambridge, the research team created printed fabric sensors capable of detecting tiny skin movements to monitor breathing, even when the pyjamas are worn loosely around the neck and chest. Paired with a lightweight AI algorithm trained on the sensor signals, the pyjamas identify six different sleep states with 98.6% accuracy while disregarding regular sleep movements such as tossing and turning. The energy-efficient system needs only a few examples of sleep patterns to differentiate between regular and disordered sleep.

By making accurate sleep monitoring easy to do at home, the smart pyjamas offer a potential solution for the millions of people struggling with sleep disorders. The results are reported in the Proceedings of the National Academy of Sciences (PNAS).

Poor sleep affects over 60% of adults, leading to productivity losses of 44-54 working days annually and an estimated 1% reduction in global GDP. Sleep disturbances, including mouth breathing, sleep apnoea, and snoring, contribute to serious health conditions such as cardiovascular disease, diabetes, and depression.

Professor Luigi Occhipinti from the Cambridge Graphene Centre, who led the research, emphasized the importance of accurate sleep monitoring. “Current gold standard methods like polysomnography (PSG) are expensive and complex, making them unsuitable for long-term home use.”

While alternative home sleep tests are available, they are often bulky, uncomfortable, or focused on a single condition. Wearable devices, such as smartwatches, can infer sleep quality but lack accuracy in detecting disordered sleep.

The smart pyjamas build upon the team's previous work on a graphene-based smart choker for people with speech impairments. The team redesigned the sensors for breath analysis during sleep, improving their sensitivity and ensuring comfort. These enhancements allow accurate detection of different sleep states without requiring the garment to be worn tightly around the neck.

The researchers developed a machine learning model called SleepNet, which analyses the sensor signals to identify sleep states including nasal breathing, mouth breathing, snoring, teeth grinding, central sleep apnoea (CSA), and obstructive sleep apnoea (OSA). SleepNet's lightweight design allows it to run on portable devices, with no connection to external computers or servers required.
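
For readers curious what a lightweight, on-device sleep-state classifier might look like in outline, the sketch below is purely illustrative and is not the published SleepNet architecture. It assumes a single-channel breathing signal from the fabric sensor, split into fixed-length windows (1,024 samples here is an arbitrary choice), and uses a deliberately small 1D convolutional network in PyTorch to label each window with one of the six sleep states named above.

```python
import torch
import torch.nn as nn

# Illustrative sketch only -- NOT the published SleepNet model.
# The class names are taken from the article; window length and layer sizes
# are assumptions chosen to keep the model small enough for embedded hardware.
SLEEP_STATES = [
    "nasal_breathing", "mouth_breathing", "snoring",
    "teeth_grinding", "central_apnoea", "obstructive_apnoea",
]

class TinySleepClassifier(nn.Module):
    """A small 1D CNN, sized so it could plausibly run on a low-power
    processor embedded near the sensor rather than on an external server."""
    def __init__(self, n_classes: int = len(SLEEP_STATES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(8, 16, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one value per channel
        )
        self.classifier = nn.Linear(16, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, samples) -- one window of the breathing signal
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = TinySleepClassifier()
    window = torch.randn(1, 1, 1024)          # one hypothetical 1,024-sample window
    probs = model(window).softmax(dim=-1)
    print(SLEEP_STATES[int(probs.argmax())])  # predicted sleep state for this window
```

A model this size has only a few thousand parameters, which is the kind of footprint that makes on-device inference without a server connection plausible, though the actual SleepNet design and training procedure are described in the PNAS paper.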

“We optimized the AI model to achieve the best accuracy with minimal computational cost, enabling us to embed data processors directly in the sensors,” said Occhipinti.

Tested on both healthy individuals and those with sleep apnoea, the smart pyjamas demonstrated 98.6% accuracy in detecting sleep states. A special starching process was used to enhance sensor durability, allowing them to withstand regular machine washing.

The latest version of the smart pyjamas includes wireless data transfer capabilities, securely transmitting sleep data to a smartphone or computer for further analysis.

“Reliable sleep monitoring is essential for preventative healthcare,” Occhipinti noted. “Since this garment can be used at home, it provides users with insights into their sleep that they can discuss with their doctor. Factors such as nasal versus mouth breathing, often overlooked in NHS sleep studies, can indicate potential sleep disorders.”

Researchers aim to adapt these sensors for various healthcare applications, including baby monitoring, and are in discussions with patient groups. Efforts are also underway to enhance sensor durability for long-term use.

The research received support from the EU Graphene Flagship, Haleon, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).
