Green leaf




SportsMedAnalytics - TensorFlow, React Native, Flutter, Node.js, Machine Learning


We are building a smartphone app that enables the following workflow: 2-D video is recorded with a standard smartphone camera and uploaded to a central server, where Google Cloud Platform Vision and Video Intelligence APIs recognize the test subject in the frame and a pose estimation algorithm records 2-D joint position data. From these positions, mathematical operations derive motion parameters relevant to technique and injury prevention: derivatives of joint position give velocity, acceleration, and jerk in the frontal plane, and joint angles, along with angular velocities and accelerations in the frontal plane, can be calculated with relatively straightforward methods.

We need a database to store this data (presumably as part of the central server) and make it accessible both for viewing within the app (displaying variables over time, and variables on one leg compared to the other side) and for analysis outside of it. We also need to clean or smooth the raw data extracted from the video, since the pose estimator may miscalculate joint centers in some frames. Finally, the app should be able to display recommended exercise videos for injury prevention.
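As a rough illustration of the server-side post-processing described above, here is a minimal Python sketch assuming the pose estimator outputs an (n_frames, 2) array of 2-D coordinates per joint. The function names (smooth_joints, kinematics, joint_angle), the Savitzky-Golay smoothing choice, and the 30 fps example are all assumptions for illustration, not part of any existing API or the final design.

import numpy as np
from scipy.signal import savgol_filter

def smooth_joints(xy, window=11, polyorder=3):
    """Smooth raw (n_frames, 2) joint-centre coordinates to suppress
    frames where the pose estimator misplaces the joint."""
    return savgol_filter(xy, window_length=window, polyorder=polyorder, axis=0)

def kinematics(xy, fps):
    """Frontal-plane velocity, acceleration, and jerk from joint position,
    computed as successive finite differences over time."""
    dt = 1.0 / fps
    vel = np.gradient(xy, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    jerk = np.gradient(acc, dt, axis=0)
    return vel, acc, jerk

def joint_angle(prox, joint, dist):
    """Angle (degrees) at `joint` between the proximal and distal segments,
    e.g. hip-knee-ankle for knee angle in the frontal plane."""
    a = prox - joint
    b = dist - joint
    cos = np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Example usage for a 30 fps clip (hip, knee, ankle are (n_frames, 2) arrays
# returned by the pose estimator):
#   hip, knee, ankle = (smooth_joints(j) for j in (hip, knee, ankle))
#   theta = joint_angle(hip, knee, ankle)          # joint angle over time
#   omega = np.gradient(theta, 1.0 / 30)           # angular velocity
#   vel, acc, jerk = kinematics(knee, fps=30)      # linear kinematics

The same per-frame quantities could then be written to the database keyed by subject, session, joint, and side, so the app can plot them over time and compare one leg against the other.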

Years: Any

Location: Anywhere

Requested on: 2021-08-13

TensorFlow, React Native, Flutter, Node.js, Machine Learning