MediaPipe Three.js Real-Time 3D Anatomy Visualization | Nick Bisesi posted on the topic | LinkedIn

Built a real-time skeletal visualization using MediaPipe and Three.js. Created for K-12 anatomy education, but it runs entirely in the browser on any device. Been heads down building immersive experiences for a while, so this is one of many! Would love to hear what you think and what other applications people see for this! #WebXR #ThreeJS #EdTech

Another neuro tool you never asked for. A #brain #Tractography visualizer in #Threejs animated by #mediapipe. The tractography is not real; those are just drawn lines.
Demo and source code: https://www.alessandrocrimi.com/ar/brain-tractography.html
Hand Tracking with MediaPipe (Task API)

Real-time hand tracking using the MediaPipe Task API and a TensorFlow Lite model.
The 21 hand landmark points are detected live and displayed as a skeleton. I used my old PlayStation 2 EyeToy camera with a resolution of 640×480 px.

Such systems can be used for gesture control, motion capture, VR/AR interaction, touch-free interfaces, robotics interfaces, or even computer games and creative projects.

Similar techniques can be used to implement other forms of computer vision, such as face or eye tracking, by using the corresponding model instead of the hand model.
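As a minimal sketch of how the 21 detected landmarks become the skeleton overlay: the connection topology below follows MediaPipe's documented hand landmark model, while the helper function and its name are my own illustration, not part of the MediaPipe API.

```python
# MediaPipe's hand model emits 21 landmarks per hand (wrist = index 0,
# then four joints per finger). Drawing the skeleton is just a matter
# of connecting the right landmark indices with line segments.
HAND_CONNECTIONS = [
    (0, 1), (1, 2), (2, 3), (3, 4),         # thumb
    (0, 5), (5, 6), (6, 7), (7, 8),         # index finger
    (5, 9), (9, 10), (10, 11), (11, 12),    # middle finger
    (9, 13), (13, 14), (14, 15), (15, 16),  # ring finger
    (13, 17), (17, 18), (18, 19), (19, 20), # pinky
    (0, 17),                                # palm edge
]

def skeleton_segments(landmarks):
    """Turn 21 (x, y) landmark points into drawable line segments.

    `landmarks` is a list of 21 (x, y) tuples in normalized image
    coordinates, as produced per frame by the hand landmarker.
    """
    if len(landmarks) != 21:
        raise ValueError("expected 21 hand landmarks")
    return [(landmarks[a], landmarks[b]) for a, b in HAND_CONNECTIONS]
```

On each camera frame you would run the landmarker, feed the resulting points into `skeleton_segments`, and draw the segments over the 640×480 video with OpenCV or similar.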

Video workflow:

- Recorded with OBS
- Edited in Kdenlive
- Transcoded with VAAPI (H.264)

Everything runs on Linux + Python (FOSS), so anyone can set this up.

Background music: Kenke - Counting Stars (Rock Version) [Nightcore] (https://www.youtube.com/watch?v=y8OwQo225cI)

#ComputerVision #MediaPipe #MachineLearning #HandTracking #Python #Linux #OpenSource #RetroTech #EyeToy

AA (@measure_plan)

A tweet introducing a computer-vision music experiment called 'fruit xylophone': it combines MediaPipe hand tracking with Roboflow's RF-DETR object detection, runs in real time in the browser, and was implemented with transformers.js. A case of real-time in-browser ML and creative interaction.

https://x.com/measure_plan/status/2031115015567077416

#computervision #mediapipe #roboflow #transformersjs

AA (@measure_plan) on X

introducing: fruit xylophone a computer vision music experiment built with mediapipe hand tracking and roboflow RF-DETR object detection (running in realtime in the browser using transformers js)

X (formerly Twitter)

I want to create a website: a sign language dictionary.

I have around 3k clips (up to 7 s each) covering many signs, and I want to generate interactive animations (rotatable, playable slowed down or sped up, reversible) to publish on the website.

At the moment I plan to use MediaPipe Holistic, which would generate .json files for posture, hand, and face movement. Then I want to use RDM, React, and Three.js to show the model on the webpage.

Is there a better or more optimal approach to this?
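For the Holistic-to-.json step described above, here is a minimal serialization sketch; the frame layout and field names (`pose`, `left_hand`, etc.) are my own assumptions, not a fixed MediaPipe export format.

```python
import json

def frames_to_json(frames, fps=30.0):
    """Pack per-frame Holistic landmarks into a compact JSON string.

    `frames` is a list of dicts like
        {"pose": [...], "left_hand": [...], "right_hand": [...], "face": [...]}
    where each value is a list of [x, y, z] coordinates. Storing the
    fps alongside the frames lets a web player slow down, speed up,
    or reverse playback without re-exporting the clip.
    """
    doc = {
        "fps": fps,
        "frame_count": len(frames),
        "frames": [
            {part: [[round(c, 4) for c in pt] for pt in pts]
             for part, pts in frame.items()}
            for frame in frames
        ],
    }
    return json.dumps(doc, separators=(",", ":"))
```

On the Three.js side, each frame then maps onto joint positions of a rigged model; interpolating between consecutive frames gives smooth slow-motion playback.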

#SignLanguage #mediapipe

AA (@measure_plan)

A personal project introducing an interface for exploring gaussian splats: nine worlds generated with World Labs' 'marble', each visualized in a graph based on semantic traits (geometric vs. abstract, vast vs. intimate), with face-tracked camera controls added using MediaPipe computer vision.

https://x.com/measure_plan/status/2029229021070774433

#mediapipe #computervision #visualization #gaussiansplats #3d

AA (@measure_plan) on X

i made an interface for exploring gaussian splats - 9 worlds created with World Labs marble - visualized in a graph based on semantic understanding of each world (geometric vs abstract, vast vs intimate) - added mediapipe computer vision for face-tracked camera controls and


I looked into how MediaPipe's landmark detection judges the face of someone wearing a mask
https://qiita.com/ssc-karasawa/items/70991a5ac68ad1f6dc61?utm_campaign=popular_items&utm_medium=feed&utm_source=popular_items

#qiita #Python #landmark #MediaPipe

I looked into how MediaPipe's landmark detection judges the face of someone wearing a mask - Qiita

Introduction: In my previous article, "I tried face landmark detection with MediaPipe in Python", I tried out MediaPipe, and I was curious how it would judge someone wearing a mask, so I looked into it. Analysis with a masked image: a masked image and an unmasked...


AA (@measure_plan)

A shared computer-vision experiment in which the author turns himself into an NPC, built with Roboflow's rf-detr segmentation, MediaPipe, and a few lines of JavaScript (a real-time, vision-based segmentation application).

https://x.com/measure_plan/status/2024227295150362739

#computervision #roboflow #mediapipe #rfdetr #javascript

AA (@measure_plan) on X

i turned myself into an NPC [a computer vision experiment using roboflow rf-detr segmentation, mediapipe, and a few lines of javascript]


AA (@measure_plan)

An announcement of a browser-based drum machine that turns hand gestures into drum loops, generating backing tracks even while playing piano. It runs from a webcam alone, combining three.js, MediaPipe computer vision, and Kimi_Moonshot K2.5 code for real-time gesture-to-beat generation; a creative AI web-app example.

https://x.com/measure_plan/status/2021319009275093281

#mediapipe #threejs #kimi_moonshot #musicai

AA (@measure_plan) on X

i made a drum machine that turns hand gestures into drum loops and helps me create backing tracks while i'm playing piano it runs in the browser using my webcam; no special software or hardware created with @threejs, mediapipe computer vision, @Kimi_Moonshot k2.5 code, and


AA (@measure_plan)

A tweet noting ongoing research on 3D 'wiggle' physics (a physics simulation) built with three.js, MediaPipe, and Quaternius models, plus a bit of JavaScript: an experiment combining a web-based 3D framework, hand/pose tracking, and 3D model assets.

https://x.com/measure_plan/status/2020633546239725950

#threejs #mediapipe #quaternius #3d #javascript

AA (@measure_plan) on X

conducting research on 3d wiggle physics with threejs, mediapipe, quaternius models, and a bit of javascript
