A project that creates motion capture data with the AI solution MediaPipe Holistic and applies it to a Daz 3D character in Blender
The versions noted below are the ones that were used during the project.
- Blender 2.91.2
- Python 3.7/3.8
- mediapipe 0.8.3.1
- opencv-python 4.5.1.48
- a `.blend` file that contains a Daz 3D character with animation actions
- exported `.bvh` motion capture files
- exported `.fbx` version of the `.blend` file
- `load_mp_landmarks.py`, a script to create motion capture data from RGB videos; it is attached to `make_bvh_files.blend` (see the first code sketch below)
- `assign_animation_to_avatar.py`, a script to map bone rotations from `.bvh` files to a Daz 3D character; it is attached to `animate_avatar.blend`, which contains the prepared character (see the second code sketch below)
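As a rough illustration of what `load_mp_landmarks.py` does conceptually, the sketch below runs MediaPipe Holistic over an RGB video with the library versions listed above and collects the per-frame landmarks. The input path `example.mp4` and the dictionary layout are assumptions for illustration, not the actual script.

```python
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

# Placeholder path; the real project reads the sign language clips.
cap = cv2.VideoCapture("example.mp4")
frames_landmarks = []

with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        success, frame = cap.read()
        if not success:
            break
        # MediaPipe expects RGB images, while OpenCV delivers BGR frames.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        # Collect the landmark sets per frame; any of them can be None
        # if the body part was not detected in that frame.
        frames_landmarks.append({
            "pose": results.pose_landmarks,
            "left_hand": results.left_hand_landmarks,
            "right_hand": results.right_hand_landmarks,
            "face": results.face_landmarks,
        })

cap.release()
print(f"Collected landmarks for {len(frames_landmarks)} frames")
```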
Both scripts include instructions on how to use them at the top of the file. The easiest way to test them is to use the prepared Blender files.
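On the Blender side, bone rotations from an imported `.bvh` armature can be transferred to a Daz rig in several ways; the sketch below uses `COPY_ROTATION` constraints as one possibility. The object names and the bone map are assumptions for illustration and do not necessarily match what `assign_animation_to_avatar.py` or `animate_avatar.blend` actually use.

```python
import bpy

# Assumed object names; adjust to the armatures in the .blend file.
bvh_rig = bpy.data.objects["bvh_armature"]    # armature imported from a .bvh file
daz_rig = bpy.data.objects["Genesis8"]        # Daz 3D character rig

# Hypothetical mapping from .bvh bone names to Daz bone names.
BONE_MAP = {
    "Hips": "hip",
    "Spine": "abdomenLower",
    "LeftArm": "lShldrBend",
    "RightArm": "rShldrBend",
}

for bvh_bone, daz_bone in BONE_MAP.items():
    pose_bone = daz_rig.pose.bones[daz_bone]
    constraint = pose_bone.constraints.new('COPY_ROTATION')
    constraint.target = bvh_rig
    constraint.subtarget = bvh_bone
    constraint.target_space = 'LOCAL'
    constraint.owner_space = 'LOCAL'
```

Baking the constrained motion to keyframes afterwards (for example with `bpy.ops.nla.bake`) would turn the live constraints into a self-contained action on the character.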
- `main.py`, a script that analyzes video files with the MediaPipe AI, annotates all video frames and saves them in the folder `annotated_images` (see the sketch after this list)
- German Sign Language video clips to capture the motion data from
- Sign language interpreter: Mathias Schäfer
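The following is a minimal sketch of the kind of annotation loop described for `main.py`: detected landmarks are drawn onto each frame and the images are written to an `annotated_images` folder. The input path and the choice of drawn landmark sets are assumptions.

```python
import os
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic
mp_drawing = mp.solutions.drawing_utils

os.makedirs("annotated_images", exist_ok=True)
cap = cv2.VideoCapture("example.mp4")  # placeholder path

with mp_holistic.Holistic() as holistic:
    frame_idx = 0
    while cap.isOpened():
        success, frame = cap.read()
        if not success:
            break
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        # Draw pose and hand landmarks onto the BGR frame in place;
        # draw_landmarks silently skips landmark sets that are None.
        mp_drawing.draw_landmarks(frame, results.pose_landmarks,
                                  mp_holistic.POSE_CONNECTIONS)
        mp_drawing.draw_landmarks(frame, results.left_hand_landmarks,
                                  mp_holistic.HAND_CONNECTIONS)
        mp_drawing.draw_landmarks(frame, results.right_hand_landmarks,
                                  mp_holistic.HAND_CONNECTIONS)
        cv2.imwrite(os.path.join("annotated_images",
                                 f"frame_{frame_idx:05d}.png"), frame)
        frame_idx += 1

cap.release()
```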
The 3D character was created in Daz Studio and has been customized with free assets. It was transferred to Blender with the Daz to Blender Bridge.