Thursday 11 May 2017

Extended Practice: Evaluation

From the start of the module I knew what I wanted to create: an animation that delves into the fine line between technology and reality. In it, a dancer begins to perform and then gradually changes into a robotic figure, the transformation revealed by particles emitting from the girl onto the robot. I chose this because I wanted to carry on from my Context of Practice work and experiment further with particle simulations. However, I also wanted to try something new, something I had only briefly experimented with in second year: motion capture. Motion capture would allow me to get realistic movement into the performance.

This meant I had to learn a new piece of software so that I could record the performance through motion capture. I didn't mind doing this, in fact I found it exciting. The software I used was iPi Motion Capture, which uses two Xbox Kinects at a 90-degree angle to each other. The process wasn't easy, though, between the software failing to calibrate and the studio I used being inconvenient. The software did provide me with my final capture, but the data I got from it wasn't tidy, so I had to go into Maya and clean up the keyframes. Even doing this wasn't easy, and I was still left with some glitches in the motion capture data. Unfortunately I couldn't re-record the performance to get a better capture, because the dancer I used got a cruise ship job and had to relocate to Southampton. This was truly annoying, but I learnt that next time I need to capture far more takes of the performance before the dancer has to depart.
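For anyone curious what the clean-up involved, the sketch below shows the general idea of the pass I ran in Maya, not my exact script: an Euler filter over every joint's rotation curves to remove the sudden flips that raw capture data leaves behind. The joint selection is deliberately broad, and the node names are just whatever the mocap import produces.

    # Rough sketch of a mocap clean-up pass in Maya (Python / maya.cmds).
    # Joint names come from the imported skeleton; nothing here is
    # specific to my scene.
    import maya.cmds as cmds

    for joint in cmds.ls(type='joint'):
        # Gather the animation curves driving this joint's rotation.
        curves = cmds.keyframe(joint, attribute=('rotateX', 'rotateY', 'rotateZ'),
                               query=True, name=True) or []
        if curves:
            # The default Euler filter removes sudden rotation flips
            # (gimbal pops) that raw mocap data often contains.
            cmds.filterCurve(curves)

Even after a pass like this I still had to scrub through the timeline and delete or re-key the worst glitches by hand.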

Carrying on from my Context of Practice module, where I created particle simulations, the particles I used in this project came into play in a totally different way. For this project I had to learn a new way of emitting my particles so that they would come off the character's skin. To do this I had to figure out how to emit particles from the character's textures, which seemed simple but was not. I had to create multiple animated image sequences to control where and when the particles were emitted. This had me thinking hard about timing, where the particles would emit from and in what way. It was hard work and a lot of trial and error, but I did figure out how using image sequences on a texture to emit particles would work, and you can see how I did it on my blog: http://j-beardsell108053-sp.blogspot.co.uk/2017/04/extended-practice-creating-textures.html
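In rough terms, the set-up looks something like the sketch below. The mesh and file names here are placeholders rather than my actual scene, but the key pieces are a surface emitter, a file texture set to read an animated image sequence, and the emitter's texture-rate attributes, so that white areas of the map emit particles and black areas don't.

    # Hypothetical texture-driven emission set-up in Maya (Python / maya.cmds).
    # 'bodyMesh' and the image sequence name are placeholders.
    import maya.cmds as cmds

    # Surface emitter on the character mesh.
    emitter = cmds.emitter('bodyMesh', type='surface', rate=5000)[-1]

    # An empty nParticle system, connected so the emitter feeds it.
    particles = cmds.nParticle()[0]
    cmds.connectDynamic(particles, emitters=emitter)

    # File texture stepping through an animated image sequence.
    tex = cmds.shadingNode('file', asTexture=True)
    cmds.setAttr(tex + '.fileTextureName', 'emitMap.0001.png', type='string')
    cmds.setAttr(tex + '.useFrameExtension', 1)  # advance one image per frame

    # Gate the emission rate with the texture: white emits, black doesn't.
    cmds.setAttr(emitter + '.enableTextureRate', 1)
    cmds.connectAttr(tex + '.outColor', emitter + '.textureRate')

Most of the trial and error went into painting those image sequences so the white regions tracked the right parts of the body frame by frame.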

I enjoyed creating the particles because I loved seeing how I could manipulate the way they fell off the character. I tried using different force fields within Maya to make the particles fall off in dynamic and dramatic ways, which also helped me understand what effect each field had on the physics of the particles and how they moved around the space. I found that when working with particles and textured image sequences, timing is everything: if two sequences are out of step it leads to big gaps between the particles coming off and the character.
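As an illustration of the kind of layering I mean, the sketch below combines a steady field with a noisy one; the magnitude and frequency values are placeholders I would tune by eye, not the ones from my scene.

    # Hypothetical field set-up in Maya (Python / maya.cmds).
    import maya.cmds as cmds

    particles = 'nParticle1'  # assumes an existing particle system

    gravity = cmds.gravity(magnitude=9.8)[0]       # steady downward pull
    turbulence = cmds.turbulence(magnitude=15)[0]  # breaks up uniform motion
    cmds.setAttr(turbulence + '.frequency', 2.0)   # finer detail in the noise

    # Attach both fields so they act on the particle physics each frame.
    cmds.connectDynamic(particles, fields=[gravity, turbulence])

Layering a constant field with a noisy one like this is what gives the fall-off both weight and drama, rather than the particles just dropping straight down.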


There were a few issues that came up, the first being that the data from the motion capture wasn't as clean as I'd hoped. The other issue was that I relied too heavily on render farms to render my animation off, leaving myself two weeks before the deadline to do all the rendering and post-production. WRONG IDEA! When it came to sending my project off to be rendered at a render farm called 'Rebus Farms', it came back with an error saying they didn't support the Arnold renderer, which I had been using for my animation. I quickly had to find another render farm that supported Arnold, which I did, Garage Farms, but after sending them my project and spending over five hundred pounds the frames I got back were wrong. The characters had a weird transparent effect on them that I didn't know how to fix, and the people at the render farm didn't know how to fix it either. This left me with no other choice but to start rendering my animation off at College with only a week and a half left to go.

There are a couple of things I would do differently if I were to redo this project. The first would be to capture the performance in a proper motion capture studio that uses twelve cameras instead of just two, which would give me better data and a cleaner result. I'd also have motion blur on the particles as they fall off the dancer; I couldn't add it this time because the render times would have been insane and I wouldn't have had a finished animation. I'd also spend more time researching different render farms so that I could save myself a bit of time and money.
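For reference, turning motion blur on in Arnold is only a couple of attributes on MtoA's render options node, something like the sketch below; it was the render cost, not the set-up, that stopped me.

    # Minimal sketch of enabling Arnold (MtoA) motion blur in Maya;
    # the step count is a placeholder I'd balance against render time.
    import maya.cmds as cmds

    cmds.setAttr('defaultArnoldRenderOptions.motion_blur_enable', 1)
    cmds.setAttr('defaultArnoldRenderOptions.motion_steps', 3)  # sub-frame samples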

To conclude, I have found this whole module fun but stressful. I enjoyed working in new software for motion capture and learning new techniques for creating particles, but found rendering my animation at the last minute very stressful. Working with a musician was a great experience as well, because I didn't have to think about copyright laws or finding music that would match my animation; I had a talented musician produce a piece of music written for it. There are some segments of the animation that I am proud of and will be placing in my showreel, but as a whole I feel I could have done better with the right equipment and software, i.e. using an OptiTrack system to capture my performance.

