#4, 10-02-2006, 05:41 AM
rdotsch (Member, The Netherlands)

Hi!

I'm working with Marc on this project. To rephrase, the problem is as follows:

For an experiment we need an avatar that imitates a user's head movements while speaking. To this end, we created our own avatar.

1) We exported the avatar to Cal3d, which worked fine. We were able to have it imitate our head movements (see the sketch after this list for roughly how we drive the bone).

2) We were also able to replace the head with a head we made in People Maker, so the avatar would be able to speak. The head mesh already had a neck attached to it; the body didn't have a neck. This worked fine.

3) But we ran into problems when we wanted to combine the imitation with the talking: People Maker made the neck vertices lose the weights we had assigned to them in Maya/3ds Max.

4) We tried to solve this by cutting the head and the neck into two separate meshes, keeping the neck attached to the body so it keeps its weights in Cal3d. Then, in People Maker, we selected the neck vertices before creating the morphs. This didn't work out: depending on which bone we try to clamp the newly created head to, either the selected neck vertices are pulled down but not attached to the body mesh (the neck), or we are left with empty space between the neck and the head.

5) Rotating the head bone instead of the neck bone will not solve the problem, because the problem is already there before we even rotate anything (the open space between the neck and the head).
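
To give an idea of step 1, here is roughly how we drive the avatar's neck from the tracker. This is only a sketch: the file name, the bone name, and the getTrackerEuler() stand-in for our tracker are placeholders, and the exact Vizard method names (getBone, setEuler, vizact.ontimer) may differ depending on the Vizard version.

[CODE]
import viz
import vizact

viz.go()

# Our own Cal3d avatar exported from 3ds Max / Maya (placeholder file name).
avatar = viz.add('our_avatar.cfg')

# Bone name from our own skeleton ('Bip01 Neck' is just an example).
neck = avatar.getBone('Bip01 Neck')
neck.lock()  # take the bone away from the animation so we can drive it ourselves


def getTrackerEuler():
    """Placeholder for our head tracker; returns (yaw, pitch, roll) in degrees."""
    return (0.0, 0.0, 0.0)


def imitate():
    # Copy the user's head orientation onto the neck bone every frame.
    yaw, pitch, roll = getTrackerEuler()
    neck.setEuler([yaw, pitch, roll])


vizact.ontimer(0, imitate)
[/CODE]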

What are we doing wrong?

If you want, we can post our Cal3d avatar and the heads we created in People Maker.

Also, it would help if we knew what the clamp and head arguments of the avatar.face() function mean. It seems that clamp needs to be a bone, but which bone (as we already mentioned, we don't use the avatars that come with Vizard)? Our current guess is sketched below.
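
For reference, this is roughly what we are trying. The file and bone names are placeholders, and treating clamp and head as keyword arguments that take bone objects is just our guess; that guess is exactly what we would like to have confirmed.

[CODE]
# Our own Cal3d avatar and our People Maker head (placeholder file names).
avatar = viz.add('our_avatar.cfg')

# Bones from our own skeleton, not from one of the stock Vizard avatars.
neckBone = avatar.getBone('Bip01 Neck')
headBone = avatar.getBone('Bip01 Head')

# Our guess: clamp is the bone the face mesh gets clamped to, and head is the
# bone that should drive the face. Is that the right reading of the two arguments?
face = avatar.face('our_head.vzf', clamp=neckBone, head=headBone)
[/CODE]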

Thanks in advance for your help,

Ron

Last edited by rdotsch; 10-02-2006 at 06:15 AM.