Thursday 11 May 2017

Extended Practice: Evaluation

From the start of the module I knew what I wanted to create: an animation that delves into the fine line between technology and reality. In my animation a dancer starts to perform but gradually changes into a robotic figure, and the transformation is revealed by particles emitting from the girl as she becomes the robot. I chose this because I wanted to carry on from my Context of Practice module and experiment more with particle simulations. However, I also wanted to try something new, something I had experimented with back in second year: motion capture. I wanted to try motion capture because it would allow me to get realistic movements into the performance.

This meant I’d have to learn a new piece of software so that I could record the performance through motion capture. I didn’t mind doing this; in fact I found it exciting. The software I used was ‘IPI Motion Capture’, which uses two Xbox Kinects placed at a 90-degree angle to each other. The process wasn’t easy though, between the software failing to calibrate and the studio I used being inconvenient. The software did give me my final performance capture, but the data I got from it wasn’t tidy, so I had to go into Maya and clean up the keyframes. Even doing this wasn’t easy, and I was still left with some glitches in the motion capture data. Unfortunately I couldn’t re-record the performance to get a better capture, because the dancer I used got a cruise ship job and had to relocate to Southampton. This was truly annoying, but I learnt that next time I’d need to capture far more performances before the dancer has to depart.

Carrying on from my Context of Practice module, where I created particle simulations, the particles in this project were used in a totally different way. For this project I had to learn a new way to emit my particles so that they could come off the character’s skin. To do this I had to figure out how to emit particles from the character’s textures, which seemed simple but was not. I had to create multiple animated image sequences that would control how the particles were emitted. This had me thinking about timing, where the particles would emit from and in what way; it was hard work and a lot of trial and error, but I did figure out how using image sequences on a texture to emit particles would work, and you can see how I did it on my blog: http://j-beardsell108053-sp.blogspot.co.uk/2017/04/extended-practice-creating-textures.html
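
As a rough illustration of the kind of set-up I mean (this is only a sketch in Maya's Python commands, with made-up mesh, node and file names rather than my exact scene), an animated image sequence can be plugged into a surface emitter's texture rate so the white areas of each frame decide where particles come off:

    import maya.cmds as cmds

    skin_mesh = 'girl_body_geo'  # assumed character mesh (needs clean UVs)

    # surface emitter on the character's skin, plus a particle object to catch the particles
    emitter_node = cmds.emitter(skin_mesh, type='surface', rate=2000)[-1]
    particle_obj = cmds.particle(name='breakAwayParticle')[0]
    cmds.connectDynamic(particle_obj, emitters=emitter_node)

    # file texture pointing at the animated emission mask (an image sequence)
    mask = cmds.shadingNode('file', asTexture=True, name='emitMask')
    cmds.setAttr(mask + '.fileTextureName', 'sourceimages/emitMask.0001.png', type='string')
    cmds.setAttr(mask + '.useFrameExtension', 1)  # step through the sequence frame by frame

    # let the texture drive the emission rate: white areas emit, black areas don't
    cmds.connectAttr(mask + '.outColor', emitter_node + '.textureRate', force=True)
    cmds.setAttr(emitter_node + '.enableTextureRate', 1)

The important part is the last two lines: once the emitter's texture rate is enabled, the timing of the image sequence is what controls when each part of the body starts shedding particles.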

I enjoyed creating the particles because I loved seeing how I could manipulate the way they fell off the character. I tried using different force fields within Maya to make the particles fall in dynamic and dramatic ways, which also helped me understand what effect each field had on the physics of the particles and how they moved around in the space. I found that when working with particles and textured image sequences, timing is everything, because if two sequences are out of sync it can lead to big gaps between the particles coming off and the character.


There were a few issues that came about, the first being that the data from the motion capture wasn’t as clean as I’d hoped. The other issue was that I relied too much on render farms to render my animation, so I left myself two weeks before the deadline to do all the rendering and post-production. WRONG IDEA! When it came to sending my project off to be rendered at a render farm called ‘Rebus Farms’, it came back with an error saying they didn’t support Arnold, the renderer I had been using for my animation. I quickly had to find another render farm that supported Arnold, which I did: Garage Farms. But after sending them my project and spending over five hundred pounds, the frames that came back were wrong. The characters had a weird transparent effect on them that I didn’t know how to fix, and the people at the render farm didn’t know how to fix it either. This left me with no other choice but to start rendering my animation at College with only a week and a half left to go.

There are a couple of things I would do differently if I were to redo this project. The first would be to capture the performance in a proper motion capture studio that uses twelve cameras instead of just two; this would give me better data and a cleaner result. I’d also add motion blur to the particles as they fall off the dancer, something I didn’t attempt this time because the render times would have been insane and I wouldn’t have had a finished animation. I’d also spend more time researching different render farms so that I could save myself some time and money.

To conclude, I have found this whole module fun but stressful. I enjoyed working with new software for motion capture and learning new techniques for creating particles, but found it very stressful rendering my animation at the last minute. Working with a musician was a great experience as well, because I didn’t have to think about copyright laws or find music that would match my animation; I had a talented musician produce a piece of music written for it. There are some segments of the animation that I am proud of and will be placing in my showreel, but as a whole I feel I could have done better with the right equipment and software, i.e. using an OptiTrack system to capture my performance.

      

Extended Practice: Editing animation together / Post Production

Editing my final animation together and compositing it wasn't too hard to do, but I did have to wait until all my renders were done before I could finish it. Because I only rendered out a hundred frames at a time, I had to go through loads of files and import around 40 different folders of image sequences so that I could place them on the timeline. This was a bit tedious, but it made me think more about the different shots and what would look better at each point in the animation, i.e. a back shot or a front shot.


I did all the credits in After Effects and used a plug-in called 'Trapcode Particular' for the main title. I did it this way because I wanted the title to represent the animation, but I didn't want to go for any crazy typography. I managed to create a title made of particles using some simple masking and particle emission; this also added more visual effects to the animation and shows the audience from the start what's going to happen.


I also did some colour correction on the animation using 'Lumetri Colour', where I basically experimented with the contrast, exposure, shadows, highlights etc. This helped pop and define the colours within the animation and make it look like a professional film. As well as this I placed another effect onto the composition and added black bars across the top and bottom of the screen to give it that widescreen / cinema look and feel.

Extended Practice: Sound Design

I can't remember if I've mentioned this before, but for the sound I got a student from Leeds College of Music to score a track for me, so that I'd have some original music and not have to deal with copyright infringement. Georgie Ward is the musician I got to help me with the music for my animation; I chose her because of the techno-like genre of music she produces, which fits in well with the aesthetic of my animation.

I found that working with her was very easy and we got along great. There were no arguments over how the music should sound, and if I wanted something changing she was happy to apply the changes. There was a time though where we spent at least three weeks just not communicating with each other because we were too busy doing our own things, but it didn't really affect the music that much, so it didn't really matter.

Georgie's work ethic was good. She started off by sharing a playlist with me on SoundCloud so that I could get a better feel for her work and choose which of the tracks she gave me would suit my animation best. She then did multiple versions of the track I chose so I could get a better feel for it, and again I picked the better one for her to develop further. This is what she came back with:


After listening to the track and creating the animation, I messaged her back to ask if it would be possible to make the track only 2 minutes and 10 seconds long, because that's how long my animation is. Fortunately her response was yes, and she edited the track down to fit my specifications, but she also added more into it to give it a bit more personality. This is what I got back from her:



Extended Practice: Press Pack Part 3

For the third part of the press pack we have to make a trailer for our animation, and I wanted mine to be dramatic but simple. To do this I made it so that only two or three seconds are shown at a time, with long pauses of blackness in between to build up the tension. I have only included the first few snippets of the animation so that I don't show too much off or give anything away.

I edited this together using Premiere Pro and made the short clips fade in and out of each other. I also placed all the credits at the end and none at the beginning, so that the name of the animation is revealed at the end of the trailer. I feel there are some good bits I could have placed in the trailer, but if I had I would have given too much of the animation away and spoilt the whole experience of the full film.

I didn't give much thought to the typography in the trailer; I just wanted to go for a font that felt industrial and made to look automated. I felt doing this would add something to the whole feel of the trailer.


Wednesday 10 May 2017

Extended Practice: Press Pack Part 2

As well as creating a bio and synopsis for my animation, I have also created a poster and four different stills from my animation to go in my press pack. The stills I have taken are key points and show the different stages of the transformation from the girl character to the robot character. I did this to give the stills an almost storyboard-like feel, so that people know from looking at them what the animation will entail.

With my poster I wanted to go for a look that didn't really explain much about the animation but got people wondering what it is about. I also didn't want to show the robot character in the poster, because I wanted the robot's reveal in the animation to have more of an impact. For the poster itself I wanted to keep it quite simple and not have too much showing, like the poster for X-Men: Days of Future Past, where it's just the actor's face with the film's title and that's it. So I decided that mine would be the girl character on one side of the poster, walking into shot so you don't even see her face, with the title of the animation 'Break Away' at the top. I also added a strapline, 'When Technology Becomes Reality', because it gives off the idea that the animation is about the growth of tech and how we're walking a thin line between CG and reality.

Tuesday 9 May 2017

Extended Practice: Final Crit / Test Edit

For the final crit I had to show my animation to my peers at Uni, so that they could get a better understanding of it and see how far I'd progressed with my project. However, at this stage I was still rendering my animation, so I didn't really have any final work to show. What I did instead was edit together a wireframe playblast of my animation, so that people could see how it would feel, which shots I was going to use and how the animation fitted with the music that Georgie produced for me.

The test edit was just an updated version of the animatic, but with more animation, and it demonstrated how the particles were going to flow off the characters. It wasn't a final edit of what the animation would be.

During the final crit I didn't really get much feedback on the test edit or what people thought of it. The only thing that was said was that the images I also showed were a bit dark; I explained that they were dark because I still had colour correction to do on them in post-production. From this I gather that most people either liked what they saw or couldn't be bothered to give me any feedback.

Extended Practice: Creating my Bio and Synopsis / Press Pack Part 1

For this module we have to create a press pack to promote our animations. Within this press pack we have to create a bio introducing ourselves and a synopsis of what our animation is about.

My Bio:

James Beardsell is a 3D animator and visual effects artist from Leeds, who is graduating from Leeds College of Art where he studied BA (Hons) Animation. He creates 3D animations, sometimes with motion capture, particle simulations and dynamic effects. James also likes to experiment with different tools within the 3D software Maya to create character elemental effects, as well as pushing his abilities in different 3D software packages and learning new techniques. If he’s not working at his part-time job at Costa Coffee, he is researching new ways in which he can create and develop visual effects across multiple software packages.

My Synopsis:

Break Away is an animation which shows how, because of advancements in technology, people are mistaking real life for CG within films and animations. It is about a performance executed by a girl, only for her to be revealed as a robot midway through the dance. The animation was created using a live performer and motion capture, applying the data from the motion capture to a 3D character which then imitates the performer. Everything else was modelled in the 3D software Maya and rendered out using the Arnold renderer.

Extended Practice: Rendering

Seeing as my animation is 3D, I decided to render it on a render farm. I chose to do this because it would render my frames quicker than the computers at Uni, although it would be very expensive. I wanted to go with a render farm called Rebus Farms, however I found out quite late that, because I'm using the Arnold renderer, Rebus Farms can't render my frames as they don't support it. So I quickly did some more research into render farms which do support Arnold, and I found a couple: one was Rendernation and the other was Garage Farms. Rendernation looked expensive and I couldn't get a quick quote on how much it would cost to use their farm. This left Garage Farms: their website was simple to use, and they have a plug-in that lets you submit from within the 3D application you're using, so that you don't have to upload your project to their website.


UNFORTUNATELY, when I tried to render my frames using Garage Farms, the frames came back with the characters transparent (look at the image below), even though when I rendered them out on my own computer they were fine. This set me back both in money, because it cost me about £500, and in time, because I only have just under 2 weeks left to render 2880 frames.


So my plan is to use the computers in the AV computer room at Uni as my main render farm, and then use my two big computers at home, which can churn out quite a few frames in minutes. I'll spread the frames across the computers and render 100 frames per computer; this will hopefully speed things along. I figure it will take around a week to render all my frames and then a couple of days to do the post-production and the press pack.
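
To give an idea of how the frames get split up (just a sketch: the scene path is made up and in practice I'll be typing the ranges into each machine by hand), the 2880 frames can be chopped into 100-frame chunks, each one handed to Maya's command-line renderer with the Arnold renderer selected:

    # prints one batch-render command per 100-frame chunk
    SCENE = "D:/projects/break_away/break_away_final.mb"  # hypothetical scene path
    TOTAL_FRAMES = 2880
    CHUNK = 100

    for start in range(1, TOTAL_FRAMES + 1, CHUNK):
        end = min(start + CHUNK - 1, TOTAL_FRAMES)
        # -r picks the renderer, -s and -e set the start and end frames
        print('Render -r arnold -s {0} -e {1} "{2}"'.format(start, end, SCENE))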

Monday 8 May 2017

Extended Practice: Trees and Lighting for Emma

The other day Emma and I sat down together and worked out the lighting and the camera angles for some shots in her animation. I also modelled some trees and a forest for her within Maya; I made them low poly to keep with the 2D theme, using toon shaders on the trees.

Because Emma wanted the trees in her other world to be a different colour from normal trees, she had to experiment with colour palettes before I could start adding the shaders to them. So I modelled four different trees and then just duplicated them to create the forest while Emma was sorting out the palettes. There were a couple of problems when duplicating the trees due to the number of them: some started acting weird, like trees creating groups of themselves without me controlling them, and others were just flagged as an unknown node, which I didn't really understand. However, when I closed Maya and reopened the scene it fixed itself and I didn't have any more problems, so that's all good now.
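
As a rough idea of how the forest was built up (this is only a sketch with made-up tree names and counts, not the exact steps I took), the base trees can be scattered around with Maya's Python commands:

    import random
    import maya.cmds as cmds

    base_trees = ['tree_A', 'tree_B', 'tree_C', 'tree_D']  # assumed source models

    for i in range(60):
        src = random.choice(base_trees)
        copy = cmds.duplicate(src, name='{0}_copy{1}'.format(src, i))[0]
        # spread the copies out, spin them round and vary their size a little
        cmds.move(random.uniform(-50, 50), 0, random.uniform(-50, 50), copy)
        cmds.rotate(0, random.uniform(0, 360), 0, copy)
        s = random.uniform(0.8, 1.3)
        cmds.scale(s, s, s, copy)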

Afterwards Emma explained to me where the camera in the environment would be placed, so that I could render a shot of the background and send it to her. The lighting wasn't too hard to do because we decided to do most of it in post-production, so I just added a directional light to the scene so it didn't look dull. I had to experiment with the sampling on the light because it was giving off shadows that were too sharp, which made the scene look like a very sunny day, which we didn't want.
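
Softening the shadows basically comes down to a couple of attributes on the light. This is only a sketch, and the aiAngle / aiSamples attributes assume the Arnold plug-in is loaded:

    import maya.cmds as cmds

    # directionalLight returns the new light shape
    light_shape = cmds.directionalLight(rotation=(-45, 35, 0), intensity=1.2)
    # a wider angle blurs the shadow edge; extra samples keep the blur clean
    cmds.setAttr(light_shape + '.aiAngle', 5)
    cmds.setAttr(light_shape + '.aiSamples', 3)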

Doing all of the above, we managed to get about half of the backgrounds finished and rendered. It won't take all day next time, because now that we know what to do it won't take long to sort out the scenes. Below is an image of one of the backgrounds we created and rendered out, where the character is in the forest.

Thursday 4 May 2017

Extended Practice: More of Modelling for Emma

More of Emma's work now: I've modelled more things, like a bridge, a bench and a lamp post. With the three I've just mentioned I tried to keep them as simple and low poly as possible, so that when it comes to texturing them it will make life easier. The bridge was made to look old, wonky and run down, but I will be adding more detail to it, like rope between the railing and the footpath, and I'll probably make some of the planks on the footpath look broken.

The lamp post and the bench were easy to model; however, I'll be placing them into a park environment with a footpath in front of both of them. I still need to build the park environment, which entails modelling trees and grass, but hopefully this won't take too long. I'm modelling in a low-poly aesthetic because I feel this will fit Emma's animation appropriately, plus the toon texturing on the models will make the 3D environment blend in more with the 2D characters.

At the minute I haven't fully textured the models, because I'm still waiting on Emma to give me the colour sheets to use for the toon shaders. We also need to sit down and talk about where the objects are going to be placed in the environments, like the park and the forest. I have, however, been able to texture the cabin fully, because Emma has sorted out the colour sheets for it, so the cabin has a two-shade toon shader on it, giving it that cartoon 2D aesthetic. The lighting isn't correct on it yet though, because I need to build a little village using the cabin; the light on it now is just an indication of how the village cabins will look in the shot.
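
For reference, a two-shade toon look like the cabin's can be roughed out with Maya's ramp shader. This is just a sketch with made-up colours and a hypothetical 'cabin_geo' model, not Emma's actual colour sheet:

    import maya.cmds as cmds

    toon = cmds.shadingNode('rampShader', asShader=True, name='cabin_toon')
    # two flat colour entries with no interpolation give the hard two-tone step
    cmds.setAttr(toon + '.color[0].color_Position', 0.0)
    cmds.setAttr(toon + '.color[0].color_Color', 0.25, 0.15, 0.10, type='double3')  # shadow tone
    cmds.setAttr(toon + '.color[0].color_Interp', 0)  # 0 = None (stepped)
    cmds.setAttr(toon + '.color[1].color_Position', 0.5)
    cmds.setAttr(toon + '.color[1].color_Color', 0.60, 0.40, 0.25, type='double3')  # lit tone

    cmds.select('cabin_geo')  # assumed cabin model
    cmds.hyperShade(assign=toon)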

Extended Practice: More Motion Capture with OptiTrack System

In this mocap session Katy, Mat (my tutor) and I went to Sheffield Hallam University to use their motion capture system, OptiTrack. This is where the subject wears a suit covered in reflective dots and performs while 12 or so cameras track their movements and place the data onto a skeleton character, so you can watch the performance in real time.

I did this because I wanted to see how much easier it would have been if I had used the OptiTrack system rather than the IPI motion capture software. The answer is yes, it would have been easier using the OptiTrack system; however, due to unfortunate events I was unable to use it to capture my main performance. The whole experience was so much fun and I'd like to experiment more with the system and produce other animations using it. I loved performing as well, even though the OptiTrack system couldn't track me in my suit for some unknown reason.

This experience was useful though, as I got to try new software and delve into the world of motion capture. It has made me realise that I'd love to go into this field in the industry, where I can dance about and create lifelike motions for animation rather than keyframing, because motion capture is more fun.




Extended Practice: Creating the Particles

The particles were the fun part of the animation; this is where I could experiment with different effects on the particles and make them look dramatic when flowing off the character. The particles were emitted from the beam texture, so that they could follow the transition from the girl to the robot.

Due to the scale of the scene, the particles had to be sized up so that the audience could see them. I did this rather than adding more particles, because my computer couldn't handle the physics and the quantity of particles in the scene; I already have 5,000,000 particles, so any more would crash the computers. I am also unable to add motion blur to the particles, because the render time would increase: the renderer has to scrub through the timeline a couple of times before rendering each frame, and for my animation that scrubbing takes a long time.

To make the particles actually fall, I added a gravity field to the scene and made the floor of the studio a passive collider, which stops the particles from falling through the floor. When I did this the particles fell in a linear fashion, which wasn't appealing, so, because I didn't want to stress the computers any more, I got rid of the gravity field. To make the particles fall with style, I placed a volume axis field and manipulated it so that the particles were pushed downwards. I chose the volume axis field because you can add turbulence to it, which makes the fall of the particles more dramatic and interesting.
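
Roughly speaking, the set-up looked like the sketch below; the particle object name is carried over from my emission set-up and the values are only placeholders I'd tweak by eye:

    import maya.cmds as cmds

    # volume axis field hovering above the floor, pushing downwards
    field = cmds.volumeAxis(position=(0, 5, 0), magnitude=8)[0]
    cmds.setAttr(field + '.directionalSpeed', 1.0)
    cmds.setAttr(field + '.directionX', 0)
    cmds.setAttr(field + '.directionY', -1)
    cmds.setAttr(field + '.directionZ', 0)
    # turbulence breaks up the straight, linear fall
    cmds.setAttr(field + '.turbulence', 0.5)
    cmds.setAttr(field + '.turbulenceSpeed', 0.3)

    # hook the field up to the particle system so it actually affects it
    cmds.connectDynamic('breakAwayParticle', fields=field)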





Extended Practice: Positioning Cameras

At first I wanted the animation to be one big shot where the camera just moved around the performer / characters, because I wanted a smooth, flowing aesthetic to the animation. However, doing this would mean longer render times, which I don't have, plus it would be more complicated when it came to distributing the frames between computers, due to the particles not flowing exactly the same.

So what I ended up doing was placing two different cameras, one at each end of the dance studio, looking 180 degrees back at each other. This way I can capture the performance from different angles while keeping the render times down. It will also help when it comes to editing the animation in post-production, because I'll have more control over what is shown in each shot and can choose which camera has the better angle for each moment.
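
In Maya the set-up itself is simple; the sketch below (with made-up positions rather than my actual studio dimensions) shows the idea of two cameras facing back at each other across the origin:

    import maya.cmds as cmds

    cam_front = cmds.camera(name='cam_front')[0]
    cam_back = cmds.camera(name='cam_back')[0]

    # a new camera looks down the negative Z axis, so one placed at +Z already
    # faces the performer at the origin; the other is spun 180 degrees round
    cmds.move(0, 1.6, 10, cam_front)
    cmds.move(0, 1.6, -10, cam_back)
    cmds.rotate(0, 180, 0, cam_back)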

I also chose to have two cameras because, while the characters are dancing, the data from the motion capture gets a tad jumpy, making the performance a tad jumpy too. To fix this problem I made the edit cut from one camera to the other when the jumpiness in the performance takes place; doing this hides the small glitches in the performance.

Looking back at my animatic, the camera position was static, but that wasn't my main idea back then either; from the start I wanted the camera to be constantly moving throughout the animation. Representing this in the animatic, though, would have made the performance look a bit all over the place and people wouldn't have understood it. This is why, for the animatic, I kept the camera in one position and focused on the particles breaking away from the characters.

Extended Practice: Testing Textures on characters

After creating the textures, I tried them out on the characters in the scene; however, I did make a couple of mistakes. I made some of the image sequences for the arm in the wrong order, so the arm was disappearing from the top of the shoulder rather than from the hand upwards. I must have made this mistake about two more times after that, because the UV maps on the robot and girl characters weren't very clear to read, hence why I kept making the mistakes.
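
If I'd spotted it sooner, one way to fix it without repainting anything would have been to reverse the frame order of the offending sequence. A rough sketch, with a hypothetical folder and naming convention:

    import os

    folder = 'sourceimages/arm_mask'  # assumed folder of arm_mask.####.png frames
    frames = sorted(f for f in os.listdir(folder) if f.endswith('.png'))

    # write the reversed order out under a new prefix so nothing is overwritten mid-rename
    for new_index, name in enumerate(reversed(frames), start=1):
        new_name = 'rev_arm_mask.{0:04d}.png'.format(new_index)
        os.rename(os.path.join(folder, name), os.path.join(folder, new_name))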

But after I got everything fixed, the transition from girl to robot looked quite fluid and the two were in time with one another. I did also have a bit of an issue where the robot character was appearing before the girl character had disappeared, but I just had to change the timings of the textures in After Effects.

Now all I need to do is add particles to the characters and see how the transition looks between the girl and the robot character. I'll be adding the particles with the beam textures I've created, where the edges of the white beam are noisy, deforming the outline of the beam and emitting the particles irregularly and randomly to make the break away more realistic and believable.