
OpenCap

Posted by Scott Uhlrich, Assistant Professor, University of Utah | Director, Movement Bioengineering Lab | PhD, Stanford


OpenCap Monocular beta release — 3D human motion & forces from 1 phone!

Congrats to Selim Gilon & Emily Miller for the software and the excellent preprint (linked below).

Multi-camera OpenCap helped move us out of the lab but still required multiple phones and a laptop. We’ve run studies with hundreds of participants in gyms, clinics, and at patient conferences, but we still had to be there to collect the data, preventing massive decentralized biomechanics studies. With OpenCap Monocular, participants can now collect 3D biomechanics data on themselves at home in <2 minutes (no laptop, no calibration, no neutral trial…just one iPhone).

Highlights:

- Global 3D kinematics from a static iPhone, computed for free on the OpenCap cloud platform
- Dynamics (ground forces, joint moments, muscle forces) from hybrid physics–ML pipelines
- Optimizes outputs from computer vision pose models to compute biomechanically and physically realistic kinematics (e.g., OpenSim models, realistic foot-floor interaction)
- Example session: https://dev.app.opencap.ai/session/28203da1-c73a-42e5-8f66-cb35211b2354

Validation:
- Rotational accuracy: 4.8° vs. marker-based mocap; <1° difference from two-camera OpenCap
- Translational accuracy: 3.4 cm vs. marker-based mocap
- Kinetics accurate enough for several downstream tasks relevant to age-related changes in muscle strength (knee moment during sit-to-stand) and osteoarthritis (joint loading during walking)

Get started:
- Follow our best practices to get good results. Recording setup matters, and you may need to iterate. Getting-started guides, best practices, and troubleshooting tips are on opencap.ai.
- We are in beta, still working to make the pipeline reliable for as many activities as possible. For example, it does not currently work for jumping, treadmill walking, or small pediatric populations…we're working on it. Leave feedback on our forum.

Excited for our first major software release from the University of Utah Movement Bioengineering Lab!

This work builds on great work in computer vision and biomechanics (e.g., #OpenSim, #OpenCap, and WHAM).

Preprint: https://arxiv.org/abs/2603.24733
Video abstract: https://www.youtube.com/watch?v=CYV7An1_mWA
Website & getting started: https://utahmobl.github.io/OpenCap-monocular-project-page/
Project page: https://lnkd.in/ghJenTEb
Open-source code: https://github.com/utahmobl/opencap-monocular

Thanks to the Myotonic Dystrophy Foundation for supporting this work!