I want to create a 3D multiplayer game where the faces of the characters are driven in real time by face-tracking software on a PC (via webcam, smartphone, etc.). But I want it to be super realistic, like the characters you have created using Ziva Dynamics. The voice and facial expressions should stay in sync in the live game to make it feel truly lifelike.
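For context, here is a minimal sketch of the kind of data flow I'm imagining: a tracker samples blendshape weights each frame and streams them, timestamped, to the game server so face and voice can be synced on arrival. Everything here is an assumption on my part, not a real integration: the blendshape names follow Apple's ARKit convention, the tracker output is simulated, and the server address is hypothetical.

```python
import json
import socket
import time

def capture_frame(t):
    # Simulated tracker output; a real implementation would read these
    # weights from a tracking SDK (e.g. ARKit on an iPhone produces
    # ~52 blendshape coefficients per frame; three are shown here).
    return {
        "timestamp": t,
        "blendshapes": {
            "jawOpen": 0.4,
            "mouthSmileLeft": 0.1,
            "eyeBlinkRight": 0.0,
        },
    }

def encode_frame(frame):
    # Timestamped JSON so the game can line the face data up with the
    # voice stream on arrival.
    return json.dumps(frame).encode("utf-8")

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server = ("127.0.0.1", 9000)  # hypothetical game-server address
    for _ in range(3):
        payload = encode_frame(capture_frame(time.time()))
        sock.sendto(payload, server)  # fire-and-forget, like voice packets
    sock.close()
```

The idea is that the game only ever receives a small stream of weights per player, not video, which is what would make this feasible over the network.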
I have a few questions.
- Is the above possible?
- How do you currently scan super realistic faces into Ziva Dynamics? Can this be done with an iPhone with LiDAR, or do you need special equipment and software?
- How long does it take to create a super realistic face from source images/scans, etc.?
Thanks,
Meg