Gait Labs: Human Activity Recognition Using Smartphones

Author: Ronald Luo, MSc


Can our smartphones act as early detectors for fall risk and neurodegenerative diseases like dementia?


🧩 Introduction

By 2050, the global population aged 60+ is projected to double.
With this shift come growing concerns around mobility loss, cognitive decline, and the rising cost of elder care.

Traditional diagnostics rely on clinical visits and specialized equipment. These tools work well, but they're expensive, inaccessible to many, and reactive, often catching conditions too late.

I set out to explore a new path:

Can the smartphone in your pocket help predict fall risks and cognitive decline, passively and affordably?

In this post, I’ll break down the journey behind Gait Labs, a full-stack system that uses smartphones to capture human movement and applies deep learning to recognize early patterns of health decline.


⚠️ Problem Statement

Falls and dementia-related conditions often progress silently.
Gait changes happen years before diagnosis, but they are subtle and hard to track without lab equipment.

The opportunity?

  • Phones today come equipped with accelerometers, gyroscopes, and magnetometers.
  • If harnessed properly, they can turn everyday movement into actionable health insights.

πŸ—οΈ Step 1: Building the Data Collection Engine

System Overview

Gait Labs is built on a modern stack:

  • Mobile App: React Native + Expo
  • Backend: Next.js API routes, Prisma ORM, PostgreSQL, deployed on Vercel
  • Security: Two-factor authentication (Twilio)

Participants onboard through a smooth flow:

  1. Register and consent 📝
  2. Calibrate their device ⚙️
  3. Record motion data in real time 📊

Over the course of the project, we collected 10,000+ motion samples from participants using just their smartphones.
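Each recording boils down to a stream of timestamped, multi-sensor readings. As a sketch of what one such sample might look like on the wire (the field names, units, and structure here are illustrative assumptions, not the project's actual schema):

```python
from dataclasses import dataclass, asdict

@dataclass
class MotionSample:
    """One multi-axis reading captured by the mobile app (hypothetical schema)."""
    timestamp_ms: int   # epoch milliseconds from the device clock
    accel: tuple        # (x, y, z) acceleration in m/s^2
    gyro: tuple         # (x, y, z) angular velocity in rad/s
    mag: tuple          # (x, y, z) magnetic field in microtesla
    device_model: str   # recorded because calibration is device-specific

sample = MotionSample(
    timestamp_ms=1700000000000,
    accel=(0.02, -0.15, 9.81),
    gyro=(0.001, 0.0, -0.002),
    mag=(22.4, -5.1, 40.3),
    device_model="Pixel 7",
)
payload = asdict(sample)  # dict form, ready to serialize as JSON for the backend
```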

*Figure: server-client system overview*

Sensor Integration

We captured multi-axis data from:

  • Accelerometer: motion intensity
  • Gyroscope: angular movement
  • Magnetometer: directional context
  • (Optional) GPS: location tracking

The challenge? Hardware variability.
No two phones behave the same: device model, battery level, and even pocket placement all affect the readings.
Designing for this variability up front is what makes the system scalable later.
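One resilient-by-default tactic is to normalize every recording window per channel before it reaches a model, which absorbs per-device offsets such as sensor bias and orientation drift. A minimal sketch (the 128-timestep window at ~50 Hz and the 9-channel layout are common HAR conventions, assumed here rather than taken from the project):

```python
import numpy as np

def normalize_window(window: np.ndarray) -> np.ndarray:
    """Per-channel z-score normalization for a (timesteps, channels) window."""
    mean = window.mean(axis=0, keepdims=True)
    std = window.std(axis=0, keepdims=True) + 1e-8  # avoid divide-by-zero on flat channels
    return (window - mean) / std

# 128 timesteps (~2.56 s at 50 Hz), 9 channels (3 sensors x 3 axes)
raw = np.random.default_rng(0).normal(loc=9.8, scale=0.5, size=(128, 9))
norm = normalize_window(raw)  # each channel now has mean ~0 and std ~1
```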

*Figure: system infrastructure*

🧠 Step 2: Teaching Machines to Understand Movement

With data secured, the next step was to transform raw signals into meaningful predictions.

I experimented with three architectures:

  • Feedforward Neural Networks: baseline performance.
  • LSTM Networks: effective for time-sequential data.
  • GRU Networks: lightweight alternative for faster training.
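To make the GRU variant concrete, here is a minimal PyTorch sketch (the framework, layer sizes, and three-class head are illustrative assumptions, not the project's exact configuration):

```python
import torch
import torch.nn as nn

class GRUClassifier(nn.Module):
    """Minimal GRU activity classifier: recurrent encoder + linear head."""
    def __init__(self, n_channels=9, hidden=64, n_classes=3):
        super().__init__()
        self.gru = nn.GRU(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):               # x: (batch, timesteps, channels)
        _, h = self.gru(x)              # h: (1, batch, hidden) — final hidden state
        return self.head(h.squeeze(0))  # logits: (batch, n_classes)

model = GRUClassifier()
logits = model(torch.randn(4, 128, 9))  # 4 windows of 128 timesteps each
```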

Model Pipeline

*Figure: model pipeline*

Training involved:

  • Pretraining on public datasets to accelerate development.
  • Fine-tuning on our custom-collected motion data.
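The pretrain-then-fine-tune recipe can be sketched as freezing a pretrained encoder and retraining only the classification head on the smaller custom dataset. A hypothetical PyTorch illustration (the real pipeline's layers and hyperparameters may differ):

```python
import torch
import torch.nn as nn

# Pretend these weights were pretrained on a public HAR dataset.
encoder = nn.GRU(input_size=9, hidden_size=64, batch_first=True)
head = nn.Linear(64, 3)  # new head for our activity classes

for p in encoder.parameters():  # freeze the pretrained encoder
    p.requires_grad = False

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(8, 128, 9)      # batch of 8 motion windows
y = torch.randint(0, 3, (8,))   # activity labels
_, h = encoder(x)
loss = loss_fn(head(h.squeeze(0)), y)
loss.backward()                 # gradients flow only into the head
optimizer.step()
```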

Results

*Figure: model results*

Our models successfully classified activities like:

  • 🚶 Walking
  • 🪑 Sitting
  • 🧍 Standing

Challenges like domain shift (differences between training and real-world data) remain, but early results validate our approach.
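Classifying a continuous recording implies a segmentation step: the raw stream is cut into fixed-length, overlapping windows before each window is scored. A minimal sketch (a 128-step window with 50% overlap is a common HAR default, assumed here):

```python
import numpy as np

def sliding_windows(stream: np.ndarray, size: int = 128, step: int = 64) -> np.ndarray:
    """Split a (timesteps, channels) recording into overlapping windows."""
    return np.stack([stream[i:i + size]
                     for i in range(0, len(stream) - size + 1, step)])

stream = np.zeros((512, 9))            # ~10 s of 9-channel data at 50 Hz
windows = sliding_windows(stream)      # shape: (7, 128, 9)
```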


🧩 Challenges and Lessons Learned

"Real life is messy. Build for it."

Building for the wild (not the lab) taught me key lessons:

  • Sensor variability requires resilient preprocessing.
  • Pocket vs. hand placement changes the signal signature.
  • Limited initial dataset made model generalization challenging.
  • Noise and domain shift require robust, flexible models.

Despite these, we successfully built a system capable of real-world data ingestion and preliminary activity recognition.


🌍 The Bigger Vision: Towards Passive Health Monitoring

Imagine this future:

  • Your phone quietly tracks your movements throughout the day.
  • It flags subtle gait changes or instability.
  • You or your physician get early alerts, years before clinical symptoms appear.

This project lays the groundwork for scalable, consumer-grade passive health monitoring.

Next steps include:

  • Scaling data collection to thousands of users 🌐
  • Real-time user feedback and alerts 📲
  • Multi-sensor integration (e.g., smartwatches) ⌚
  • Publication and clinical validation 🩺

🧭 Conclusion: From Prototype to Possibility

Gait Labs is a proof of concept that turns a smartphone into a personal health companion.

Through engineering and research, I built:

  • ✅ A real-time motion data collection system
  • ✅ Machine learning models for human activity recognition
  • ✅ A foundation for future clinical-grade health monitoring apps

"What began as a research question has grown into a functional platform, bridging code, health science, and human impact."

*Figure: app design*

🚀 Explore More

Curious about the full technical details?

πŸ—‚οΈ View the full projects and technical materials β†’