Real-time hand tracking using the MediaPipe Task API and a TensorFlow Lite model.
All 21 hand landmarks are detected live and drawn as a skeleton overlay. I used my old PlayStation 2 EyeToy camera at a resolution of 640×480 px.
Such systems can be used for gesture control, motion capture, VR/AR interaction, touch-free interfaces, robotics interfaces, or even for computer games and creative projects.
Similar techniques can be used to implement other forms of computer vision, such as face or eye tracking, by using the corresponding model instead of the hand model.
Video workflow:
- Recorded with OBS
- Edited in Kdenlive
- Transcoded with VAAPI (H.264)
Everything runs on Linux + Python (FOSS), so anyone can set this up.
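The VAAPI transcode step can look roughly like this (file names and the render node path are examples and vary by system; this is not the exact command from the video):

```shell
# Hardware-accelerated H.264 encode via VAAPI on Linux.
# /dev/dri/renderD128 and the file names are placeholders.
ffmpeg -vaapi_device /dev/dri/renderD128 -i recording.mkv \
       -vf 'format=nv12,hwupload' -c:v h264_vaapi -qp 22 \
       -c:a copy final.mp4
```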
Background music: Kenke - Counting Stars (Rock Version) [Nightcore] (https://www.youtube.com/watch?v=y8OwQo225cI)
#ComputerVision #MediaPipe #MachineLearning #HandTracking #Python #Linux #OpenSource #RetroTech #EyeToy




