Gait Labs: Human Activity Recognition Using Smartphones
Author: Ronald Luo, MSc
Can our smartphones act as early detectors for fall risk and neurodegenerative diseases like dementia?
Introduction
By 2050, the global population aged 60+ will double.
With this shift comes growing concerns around mobility loss, cognitive decline, and the rising costs of elder care.
Traditional diagnostics rely on clinical visits and specialized equipment. These tools work well, but they're expensive, inaccessible to many, and reactive, often catching conditions too late.
I set out to explore a new path:
Can the smartphone in your pocket help predict fall risks and cognitive decline, passively and affordably?
In this post, I'll break down the journey behind Gait Labs, a full-stack system that uses smartphones to capture human movement and applies deep learning to recognize early patterns of health decline.
Problem Statement
Falls and dementia-related conditions often progress silently.
Gait changes happen years before diagnosis, but they are subtle and hard to track without lab equipment.
The opportunity?
- Phones today come equipped with accelerometers, gyroscopes, and magnetometers.
- If harnessed properly, they can turn everyday movement into actionable health insights.
Step 1: Building the Data Collection Engine
System Overview
Gait Labs is built on a modern stack:
- Mobile App: React Native + Expo
- Backend: Next.js API routes, Prisma ORM, PostgreSQL, deployed on Vercel
- Security: Two-factor authentication (Twilio)
Participants onboard through a smooth flow:
- Register and consent
- Calibrate their device
- Record motion data in real time
Over the course of the project, we collected 10,000+ motion samples from participants using just their smartphones.

Sensor Integration
We captured multi-axis data from:
- Accelerometer: motion intensity
- Gyroscope: angular movement
- Magnetometer: directional context
- (Optional) GPS: location tracking
The challenge? Hardware variability.
No two phones behave the same. Device model, battery level, and even pocket placement impact readings.
Designing for this variability upfront ensures future scalability.
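One common way to absorb that variability is to resample every stream onto a fixed rate and normalize each axis before anything downstream sees it. A minimal NumPy sketch of that idea; the 50 Hz target and the function names are illustrative assumptions, not the actual Gait Labs pipeline:

```python
import numpy as np

def resample_to_fixed_rate(t, x, target_hz=50.0):
    """Linearly interpolate an irregularly sampled signal onto a uniform grid.

    t : (N,) timestamps in seconds (monotonically increasing)
    x : (N, C) sensor samples, e.g. 3-axis accelerometer readings
    """
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    uniform_t = np.arange(t[0], t[-1], 1.0 / target_hz)
    # Interpolate each channel independently onto the uniform timeline.
    resampled = np.column_stack(
        [np.interp(uniform_t, t, x[:, c]) for c in range(x.shape[1])]
    )
    return uniform_t, resampled

def standardize(x):
    """Per-channel zero-mean, unit-variance scaling to absorb device scale offsets."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)
```

With every device's stream mapped onto the same grid and scale, downstream models never need to know which phone produced a sample.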

Step 2: Teaching Machines to Understand Movement
With data secured, the next step was to transform raw signals into meaningful predictions.
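A typical first transformation is segmentation: the continuous sensor stream is cut into fixed-length, overlapping windows that a model can classify one at a time. A short NumPy sketch; the 128-sample window with 50% overlap is a common HAR convention, not necessarily this project's exact setting:

```python
import numpy as np

def sliding_windows(signal, window=128, overlap=0.5):
    """Split a (T, C) multi-channel signal into overlapping windows.

    Returns an array of shape (num_windows, window, C).
    """
    step = int(window * (1 - overlap))
    starts = range(0, len(signal) - window + 1, step)
    return np.stack([signal[s:s + window] for s in starts])
```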
I experimented with three architectures:
- Feedforward Neural Networks: baseline performance.
- LSTM Networks: effective for time-sequential data.
- GRU Networks: lightweight alternative for faster training.
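As one concrete illustration of the GRU variant, a classifier over windowed inertial data might look like the following PyTorch sketch. The layer sizes and the six-channel input (accelerometer plus gyroscope) are my assumptions, not the project's actual configuration:

```python
import torch
import torch.nn as nn

class GRUClassifier(nn.Module):
    """GRU over a (batch, time, channels) window; classifies from the final hidden state."""

    def __init__(self, n_channels=6, hidden=64, n_classes=3):
        super().__init__()
        self.gru = nn.GRU(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        _, h_n = self.gru(x)       # h_n: (1, batch, hidden), the last hidden state
        return self.head(h_n[-1])  # logits: (batch, n_classes)
```

Swapping `nn.GRU` for `nn.LSTM` gives the heavier variant (it additionally returns a cell state, so the forward pass needs a small adjustment), while a feedforward baseline would simply flatten the window.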
Model Pipeline

Training involved:
- Pretraining on public datasets to accelerate development.
- Fine-tuning on our custom-collected motion data.
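The pretrain-then-fine-tune step amounts to loading weights learned on public data, freezing the recurrent encoder, and retraining only the classification head on the newly collected windows. A hedged PyTorch sketch of that pattern; the model shape, file name, and optimizer settings are illustrative:

```python
import torch
import torch.nn as nn

# Hypothetical pretrained encoder plus a fresh head for our label set.
encoder = nn.GRU(input_size=6, hidden_size=64, batch_first=True)
# encoder.load_state_dict(torch.load("pretrained_gru.pt"))  # weights from public data

for p in encoder.parameters():      # freeze the pretrained encoder
    p.requires_grad = False

head = nn.Linear(64, 3)             # new head: e.g. walking / sitting / standing
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(8, 128, 6)          # one batch of windows (stand-in data)
y = torch.randint(0, 3, (8,))

_, h_n = encoder(x)                 # encode each window
logits = head(h_n[-1])
loss = loss_fn(logits, y)
loss.backward()                     # gradients flow only into the head
optimizer.step()
```

Freezing the encoder keeps the representation learned from the larger public corpus intact while the small custom dataset only has to fit the final layer.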
Results

Our models successfully classified activities like:
- Walking
- Sitting
- Standing
Challenges like domain shift (differences between training and real-world data) remain, but early results validate our approach.
Challenges and Lessons Learned
"Real life is messy. Build for it."
Building for the wild (not the lab) taught me key lessons:
- Sensor variability requires resilient preprocessing.
- Pocket vs. hand placement changes the signal signature.
- Limited initial dataset made model generalization challenging.
- Noise and domain shift require robust, flexible models.
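One standard mitigation for the placement problem is to augment training windows with random 3-D rotations, so the model stops relying on a fixed device orientation. A NumPy sketch of that idea; this is a simplification, not necessarily how Gait Labs handles it:

```python
import numpy as np

def random_rotation_matrix(rng):
    """Draw a uniformly random 3-D rotation via QR decomposition."""
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    q *= np.sign(np.diag(r))   # fix column signs so the distribution is uniform
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]     # ensure a proper rotation (det = +1)
    return q

def augment_orientation(window, rng):
    """Apply one random rotation to a (time, 3) inertial window."""
    return window @ random_rotation_matrix(rng).T
```

Because rotations preserve vector magnitudes, the augmented windows keep the same motion intensity while the axis composition changes, exactly the invariance pocket-vs-hand placement demands.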
Despite these, we successfully built a system capable of real-world data ingestion and preliminary activity recognition.
The Bigger Vision: Towards Passive Health Monitoring
Imagine this future:
- Your phone quietly tracks your movements throughout the day.
- It flags subtle gait changes or instability.
- You or your physician get early alerts, years before clinical symptoms appear.
This project lays the groundwork for scalable, consumer-grade passive health monitoring.
Next steps include:
- Scaling data collection to thousands of users
- Real-time user feedback and alerts
- Multi-sensor integration (e.g., smartwatches)
- Publication and clinical validation
Conclusion: From Prototype to Possibility
Gait Labs is a proof of concept that turns a smartphone into a personal health companion.
Through engineering and research, I built:
- A real-time motion data collection system
- Machine learning models for human activity recognition
- A foundation for future clinical-grade health monitoring apps
"What began as a research question has grown into a functional platform, bridging code, health science, and human impact."

Explore More
Curious about the full technical details?