Yeah, I can relate to removing everything before export, after having seen how many problems all these constraints and IK setups already cause inside Blender.
I am currently fighting to get procedural animations running on top of the ones I exported from Blender; maybe these constraints are blocking all my attempts?
More likely I just need to spend more time with Unity's animation classes and learn a bit more about how to juggle the different ways of blending animations together. After reading various topics on the net, it seems the possibilities are staggering (and so is the confusion among newbie users)...
Anyway, I'll take a break from animations for now and return to some other tasks: setting up my SpeedTrees and working on some environmental stuff. I still need to do a ton of other animations (currently the character just runs and idles), as well as find out why my character doesn't respond to the procedural attempts from my scripts. But for now I am happy to have come this far.
As promised, here is what I have achieved so far, textured and set up with a basic character controller in Unity:
[attachment=32414:run_animation.gif]
Not perfect, but good enough for the moment.
One interesting question came up yesterday when I talked to a friend about my first steps in animating something in Blender. He asked why I didn't let the Blender physics animate the pouches and grenades hanging from the character's belt (I did it by hand with bones).
Well, duh, I didn't even think about the Blender physics (engine?)... Is it possible to let Blender calculate the position/rotation of a bone with its physics engine, and capture that position/rotation with "Visual LocRot" as a keyframe?
Setting up the joints and all might take some time, but compared to animating everything by hand it might be worth it.
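From what I gather, the usual trick would be to simulate stand-in rigid body objects, constrain the pouch bones to them (e.g. with a Copy Transforms constraint), and then bake the result to keyframes with visual keying. If that works, the bake step could probably even be scripted; here is a rough, untested sketch of that last part. It assumes the armature is the active object in Pose Mode, that the pouch/grenade bones already follow rigid-body proxies through constraints, and the "pouch" prefix and frame range are just placeholders for my setup:

[code]
import bpy

arm = bpy.context.object  # the active armature (assumption: already in Pose Mode)

# Select only the bones that are driven by the physics proxies
# ("pouch" prefix is a hypothetical naming scheme).
for pbone in arm.pose.bones:
    pbone.bone.select = pbone.name.startswith("pouch")

# Bake the visual (constraint-evaluated) transforms into ordinary keyframes,
# then strip the constraints so only the baked keys remain.
bpy.ops.nla.bake(
    frame_start=1,
    frame_end=60,          # hypothetical simulation range
    only_selected=True,
    visual_keying=True,    # same idea as inserting "Visual LocRot" keys
    clear_constraints=True,
    bake_types={'POSE'},
)
[/code]

With clear_constraints enabled, only the baked keyframes should end up in the action, so the export to Unity should look just like a hand-keyed animation.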