Wearables, Movement and the Return of the Physical Patient
Digital health has spent years staring at screens. Patients live in bodies.
They walk, sleep, cough, recover, compensate, limp, fatigue, sometimes deteriorate, and often adapt in ways no one writes down. They do most of this away from the clinic. A patient recovering from a knee reconstruction might be seen for ten or fifteen minutes in consulting rooms; the actual recovery is happening at home, on the stairs, in the kitchen, in the middle of the night when sleep won't come. None of that is in the chart.
This is why CTI's early work on biosensors and movement still matters — not as nostalgia, but because it set the rule we still work to.
In our Body Movement Project, we looked at whether wearable sensors could classify shoulder and knee movement using angular-motion sensors and EMG. Bend Labs flexible sensors. Delsys EMG. A protocol of structured shoulder and knee exercises. Cleaning, signal processing, a KNN classifier using Pearson correlation as its distance measure, and repeated random subsampling to estimate accuracy. Not glamorous. Useful.
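The project's actual code isn't reproduced here, but the core idea — nearest-neighbour classification with a Pearson-correlation distance, scored by repeated random subsampling — can be sketched in plain Python. Function names, parameters, and defaults below are illustrative assumptions, not the project's implementation:

```python
import math
import random
from collections import Counter

def pearson_distance(a, b):
    # 1 - Pearson correlation: ~0 for same-shaped signals, up to 2 for anti-correlated
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return 1.0 - cov / (sa * sb)

def knn_predict(train_X, train_y, sample, k=3):
    # Rank training windows by correlation distance; majority vote among k nearest
    ranked = sorted(range(len(train_X)),
                    key=lambda i: pearson_distance(sample, train_X[i]))
    votes = Counter(train_y[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

def subsample_accuracy(X, y, n_splits=20, test_frac=0.3, k=3, seed=0):
    # Repeated random subsampling: shuffle, hold out a test fraction, average accuracy
    rng = random.Random(seed)
    accs = []
    for _ in range(n_splits):
        idx = list(range(len(X)))
        rng.shuffle(idx)
        n_test = int(len(X) * test_frac)
        test, train = idx[:n_test], idx[n_test:]
        tX = [X[i] for i in train]
        ty = [y[i] for i in train]
        correct = sum(knn_predict(tX, ty, X[i], k) == y[i] for i in test)
        accs.append(correct / n_test)
    return sum(accs) / n_splits
```

The appeal of a correlation distance for movement data is that it compares the *shape* of a repetition rather than its raw amplitude, which varies with sensor placement and patient strength.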
The project was early, imperfect, and exactly the kind of work a serious healthcare technology company should be doing. It showed us both what was possible and where the road got rough.
The promise was clear. Objective joint-movement data could give clinicians a real picture of recovery, range of motion, compliance, and functional progress between visits. The original paper described, quite specifically, the potential for mobile applications to support patient-specific rehabilitation after injury or surgery — moving away from the generic, one-size-fits-all programmes that don't account for the patient's age, condition, or actual functional baseline.
The friction was just as instructive. Sensors were too thick. Tape created motion artefact. Wireless range was limited. Real-time streaming wasn't there yet. Slicing and cleaning data manually didn't scale. Dataset size mattered. Sensor placement mattered enormously. The signal was real, but it was noisy, and it took work to make it clinically meaningful.
That isn't failure. That is R&D.
The lesson I took from that work — and it has shaped everything we've built since — is that the body is noisy but clinically informative. A wearable signal is not automatically medical truth. It needs context, validation, consent, careful interpretation, and a clinician who understands what the data can and cannot say. Anyone selling you a story about wearables replacing clinicians is selling you something else.
This becomes more relevant, not less, as healthcare moves toward prehabilitation, rehabilitation, and remote monitoring. A smartwatch can tell us something. A skin-based sensor can tell us something else. A physiotherapist's hands-on assessment tells us something neither of those can. The patient's own experience completes the picture, and is too often left out of it.
The future isn't wearables versus clinicians. It is signal fusion done well. Activity, sleep, HRV, movement, pain, function, medication, imaging, the operation note itself, and the patient's own report — all coming together in a way that supports decisions rather than burying them in a dashboard.
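As one illustration of fusion that supports decisions rather than burying them: a minimal sketch in which several streams land in one per-day record and a single rule decides whether a clinician should look. Every field name, threshold, and the `flag_for_review` logic here is hypothetical, standing in for whatever a clinical team would actually define and validate:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class DailyRecord:
    # One fused record per patient-day; fields are illustrative, not a real schema
    steps: int
    sleep_hours: float
    knee_flexion_deg: float  # peak flexion from a movement sensor
    pain_score: int          # 0-10, patient-reported

def flag_for_review(history, latest, flexion_drop=10.0, pain_rise=2):
    # Surface a decision, not a dashboard: flag only when fused signals
    # move against the recovery trend together, not when one stream wobbles
    base_flexion = mean(r.knee_flexion_deg for r in history)
    base_pain = mean(r.pain_score for r in history)
    worse_motion = latest.knee_flexion_deg < base_flexion - flexion_drop
    worse_pain = latest.pain_score > base_pain + pain_rise
    return worse_motion and worse_pain
```

The design point is the conjunction: a lone noisy stream should not page anyone, but concordant deterioration across objective movement data and the patient's own report is exactly the signal worth a clinician's time.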
CTI's path from movement sensors to AI-native clinical infrastructure is not a swerve. It is a return to the physical patient. The data should serve the person, not the dashboard. That principle is going to matter more, not less, as remote and home-based care becomes the default rather than the exception.
We learned that lesson on a knee. We're carrying it everywhere.
CTI is the AI-native parent company behind Regenemm Healthcare.