Is It Ortho or Neuro? A Wearable Sensor Just Got It Right 96% of the Time.

You know the case. The dog walks in and something is wrong. The owner knows it. You know it. And then comes the question that has humbled even the most experienced clinicians in small animal practice: is this orthopedic or neurological?

It is not always obvious. The gait abnormalities overlap. The history is incomplete. The dog is uncooperative on the exam table in the specific way that only dogs being watched carefully can be. You do your neurological exam. You palpate. You watch them walk again. You make your best call and you order your diagnostics accordingly — knowing that if you called it wrong, you just sent your client down an expensive road in the wrong direction.

A study published this month in Scientific Reports suggests that a wearable inertial sensor combined with a deep learning model can make that call with 96% accuracy.

Let that number sit for a second.

What the Study Actually Did

Researchers from the University of Haifa and the University of Veterinary Medicine Hannover attached inertial measurement unit sensors to dogs and used the resulting movement data to train a deep learning model to classify gait into three categories: healthy, orthopedic, and neurological. On a dataset of 29 dogs, the model achieved 0.96 accuracy on that multiclass classification task, and 0.85 accuracy on a binary healthy-versus-non-healthy task when generalizing to dogs it had never seen before.
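The clinical shape of that pipeline — windows of raw sensor data in, one of three gait labels out — can be sketched in a few lines. To be clear about what is assumed here: the study used a deep learning model whose architecture the post does not detail, so the hand-made mean/variance features and the nearest-centroid rule below are simplified stand-ins, and the six-axis sample layout and window length are illustrative, not the study's configuration.

```python
import math

# Hypothetical 6-axis IMU sample: (ax, ay, az, gx, gy, gz).
# Axis count, units, and window length are illustrative assumptions.
CLASSES = ["healthy", "orthopedic", "neurological"]

def window_features(window):
    """Collapse a window of 6-axis samples into per-axis mean and
    variance — a crude stand-in for the features a deep network learns."""
    n = len(window)
    feats = []
    for axis in range(6):
        vals = [sample[axis] for sample in window]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        feats.extend([mean, var])
    return feats

def fit_centroids(labeled_windows):
    """Toy 'model': average feature vector per gait class.
    `labeled_windows` is a list of (label, window) pairs."""
    sums, counts = {}, {}
    for label, win in labeled_windows:
        f = window_features(win)
        if label not in sums:
            sums[label] = [0.0] * len(f)
            counts[label] = 0
        sums[label] = [a + b for a, b in zip(sums[label], f)]
        counts[label] += 1
    return {c: [v / counts[c] for v in sums[c]] for c in sums}

def classify(centroids, window):
    """Assign the gait class whose centroid is nearest in feature space."""
    f = window_features(window)
    return min(centroids, key=lambda c: math.dist(f, centroids[c]))
```

In the study, a trained deep network replaces both the hand-made features and the centroid rule — but the input/output contract a clinic would interact with is the same: motion windows in, a gait category out.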

To be clear about what 0.96 means in practice: the model correctly distinguished between healthy gait, orthopedic gait, and neurological gait at near-perfect accuracy. Not in a controlled treadmill environment with ideal sensor placement and cooperative patients. In a clinical setting, with real dogs, using wearable sensors that move with the animal.

The researchers also explored variations in sensor configurations and assessment protocols, working to optimize both performance and generalizability — the two things that separate an academically interesting result from something that could actually live in a veterinary clinic.

Why Ortho vs. Neuro Is So Hard

The diagnostic challenge at the center of this study is one that every small animal practitioner understands viscerally. Neurological and orthopedic conditions can produce gait abnormalities that look remarkably similar to the naked eye, even to experienced clinicians. Hindlimb ataxia and bilateral hindlimb lameness can overlap in presentation. A dog with intervertebral disc disease can look like a dog with bilateral stifle disease. In its early stages, degenerative myelopathy can look like hip dysplasia.

The consequences of misclassification are real. Diagnostic workups for orthopedic versus neurological conditions point in fundamentally different directions — different imaging, different specialists, different timelines, and different costs for the client. Getting it wrong early does not just waste money. It delays the right answer at a time when, for some neurological conditions especially, time matters.

Current subjective assessment tools — numerical rating scales, visual analogue scoring, observational gait analysis — are limited by inter-observer variability. Two experienced clinicians watching the same dog walk can and do reach different conclusions. The inertial sensor removes that variability entirely. The data is the data.

What Inertial Sensors Actually Measure

Inertial measurement units, or IMUs, are small wearable sensors that capture acceleration and angular velocity across multiple axes as an animal moves. They are the same basic technology that lives in your smartphone and in fitness wearables, applied here to the specific biomechanical signatures of canine gait.

What makes this study clinically significant is not that IMUs are new — they have been used in canine gait research for years — but that combining IMU data with a deep learning architecture produced classification accuracy that approaches or exceeds what experienced clinicians can achieve subjectively, and does so objectively and reproducibly across patients the model has never encountered before.

The 0.85 accuracy in generalizing to unseen dogs is the number the researchers themselves emphasize as the key marker of real-world applicability. A model that performs beautifully on its training data and collapses when it meets a new patient is not a clinical tool. A model that maintains strong performance on dogs it has never seen is the beginning of one.
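The distinction drawn above — testing on dogs the model never trained on, rather than on held-out windows from dogs it already knows — is what subject-wise evaluation (for example, leave-one-subject-out splitting) enforces. The sketch below shows only that split logic; the dog IDs are hypothetical, and whether the study used exactly this scheme is an assumption beyond what the post states.

```python
from collections import defaultdict

def leave_one_dog_out(samples):
    """Yield (held_out_dog, train, test) splits where every window
    from one dog is held out together.

    `samples` is a list of (dog_id, window, label) tuples; the dog
    IDs are hypothetical placeholders for real patient identifiers."""
    by_dog = defaultdict(list)
    for dog_id, window, label in samples:
        by_dog[dog_id].append((window, label))
    for held_out in by_dog:
        # Train on every other dog's windows; test on the held-out dog.
        train = [s for d in by_dog if d != held_out for s in by_dog[d]]
        test = by_dog[held_out]
        yield held_out, train, test
```

Splitting by window instead of by dog lets a model quietly memorize individual gait signatures and inflates accuracy; the 0.85 figure matters precisely because it was measured the harder way.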

What This Means for Your Exam Room

The researchers are measured in their conclusions — appropriately so for a study of 29 dogs that the authors themselves describe as a foundation for further investigation. Larger cohorts are needed. Sensor configurations need refinement. The model needs to meet more dogs before it earns a place in clinical decision-making.

But the trajectory is clear. Objective, wearable, AI-assisted gait analysis for canine orthopedic and neurological differentiation is not a distant hypothetical. It is a research program with a 96% accuracy result behind it and a direct line of sight to clinical application.

For veterinary professionals who have spent careers making this call on instinct, experience, and the particular way a dog carries its left rear leg across a parking lot, that is not a threat. It is a tool. The kind of tool that confirms your clinical suspicion, catches the cases you might have hedged on, and gives you something objective to show a client who needs to understand why you are recommending an MRI instead of radiographs.

The dog walked in and something was wrong. Soon, a sensor on its back might tell you exactly what.

The full study, "Canine gait analysis using inertial sensors and deep learning for orthopedic and neurological disorders," is published in Scientific Reports, 2026.
