Thursday, 11 May 2017

Extended Practice: Evaluation

From the start of the module I knew what I wanted to create: an animation that delves into the fine line between technology and reality. In my animation a dancer starts to perform but gradually changes into a robotic figure, with the transformation revealed by particles emitting from the girl as the robot appears. I wanted to carry on from my Context of Practice work and experiment more with particle simulations. However, I also wanted to try something new that I'd touched on in second year: motion capture. I wanted to try motion capture because it would allow me to get realistic movements into the performance.

This meant I'd have to learn a new piece of software so that I could record the performance through motion capture. I didn't mind doing this; actually I found it exciting. The software I used was 'iPi Motion Capture', which uses two Xbox Kinects at a 90 degree angle to each other. The process wasn't easy though, partly because the software wouldn't calibrate and partly because the studio I used was an inconvenience. The software did provide my final outcome, but the data I got from it wasn't tidy, so I had to go into Maya and clean up the keyframes. Even doing this wasn't easy, and I was still left with some glitches in the motion capture data. Unfortunately I couldn't re-record the performance to get a better capture, because the dancer I used got a cruise ship job and had to relocate to Southampton. This was truly annoying, but I learnt that next time I'd need to capture far more performances before the dancer departs.

Carrying on from my Context of Practice module, where I created particle simulations, the particles in this project came into use in a totally different way. For this project I had to learn a new way to emit particles so that they could emit from the character's skin. I had to figure out how to emit particles from the character's textures; this seemed simple, but it was not. I had to create multiple animated image sequences that would control where and when the particles were emitted. This had me thinking about timing, where the particles would emit from and in what way. It was hard work and a lot of trial and error, but I did figure out how using image sequences on a texture to emit particles would work, and you can see how I did it on my blog: http://j-beardsell108053-sp.blogspot.co.uk/2017/04/extended-practice-creating-textures.html
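If I sketch the timing logic in plain Python (not Maya code, and the names and frame numbers are made up for illustration), the idea is that each texture is an image sequence with its own start frame, and the scene frame has to map onto the right image in each sequence. If two sequences that should line up have different start frames, you get the gaps I mentioned:

```python
def sequence_frame(scene_frame, start_frame, length, hold_last=True):
    """Which image of a sequence is shown on a given scene frame."""
    if scene_frame < start_frame:
        return 0  # sequence hasn't started yet: hold the first image
    idx = scene_frame - start_frame
    if idx >= length:
        # past the end: hold the last image (or loop if requested)
        return length - 1 if hold_last else idx % length
    return idx

# Two sequences that should line up: the emission beam and the dissolve mask.
beam_start, mask_start, length = 100, 100, 50
aligned = all(
    sequence_frame(f, beam_start, length) == sequence_frame(f, mask_start, length)
    for f in range(90, 170)
)
print(aligned)  # True while the start frames match; offset one and gaps appear
```

It's a toy version of what I was doing by eye in After Effects, but it shows why a one-frame offset between the beam and the mask ripples through the whole transition.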

I enjoyed creating the particles because I loved seeing how I could manipulate the way they fell off the character. I tried different force fields within Maya to make the particles fall off in dynamic and dramatic ways, which also helped me understand the effect the different fields had on the physics of the particles and how they moved around in the space. I found that when working with particles and textured image sequences, timing is everything: if two sequences are mistimed, it can lead to big gaps between the particles and the character they're coming off.


There were a few issues that came about, the first being that the data from the motion capture wasn't as clean as I'd hoped. The other was that I relied too much on render farming to render off my animation, so I left myself two weeks before the deadline to do all the rendering and post-production. WRONG IDEA! When it came to sending my project off to a render farm called 'Rebus Farm', it came back with an error saying they didn't support Arnold renderer, which I had been using for my animation. I quickly had to find another render farm that supported Arnold, which I did: Garage Farm. But after sending them my project and spending over five hundred pounds, the frames I got back were wrong. They had a weird transparent effect on my characters that I didn't know how to fix, and the people at the render farm didn't know how to fix it either. This left me with no choice but to start rendering my animation at college with only a week and a half left to go.

There are a couple of things I would do differently if I were to redo this project. The first would be to capture the performance in a proper motion capture studio that uses twelve cameras instead of just two; this would give me better data and a cleaner result. I'd also have motion blur on the particles as they fall off the dancer: if I'd tried to add it this time, the render times would have been insane and I wouldn't have had a finished animation. I'd also spend more time researching different render farms so that I could save myself some time and money.

To conclude, I have found this whole module fun but stressful. I enjoyed working in new software for motion capture and learning new techniques for creating particles, but found it very stressful rendering my animation at the last minute. Working with a musician was a great experience as well, because I didn't have to think about copyright laws or find music that would match my animation; I had a talented musician produce a piece of music written for it. There are some segments of the animation that I am proud of and will be placing in my showreel, but as a whole I feel I could have done better with the right equipment and software, i.e. using an OptiTrack system to capture my performance.


Extended Practice: Editing animation together / Post Production

Editing together my final animation and compositing it wasn't too hard to do, but I did have to wait until all my renders were done before I could finish it. Because I rendered out my animation only a hundred frames at a time, I had to go through loads of files, importing around 40 different folders of image sequences so that I could place them on the timeline. This was a bit tedious, but it made me think more about the different shots and what would look better at each point in the animation, i.e. a back shot or a front shot.


I did all the credits in After Effects and used a plug-in called 'Trapcode Particular' for the main title. I did it this way because I wanted the title to represent the animation, but I didn't want any crazy typography. I managed to create a title out of particles using some simple masking and particle emission; this also added more visual effects to the animation, and it shows the audience from the start what's going to happen.


I also did some colour correction on the animation using 'Lumetri Colour', where I experimented with the contrast, exposure, shadows, highlights etc. This helped pop and define the colours within the animation and make it look like a professional film. As well as this, I placed another effect onto the composition and added black bars across the top and bottom of the screen to give it that widescreen / cinema look and feel.

Extended Practice: Sound Design

I can't remember if I've mentioned this before, but for the sound I got a student from Leeds College of Music to score a track for me, so that I'd have original music and wouldn't have to deal with copyright infringement. Georgie Ward is the musician who helped me with the music for my animation; I chose her because the techno-like genre of music she produced would fit in well with the aesthetic of my animation.

I found working with her very easy and we got along great. There were no arguments over how the music should sound, and if I wanted something changed she was happy to apply the changes. There was a time when we spent at least three weeks not communicating because we were too busy doing our own thing, but it didn't really affect the music that much, so it didn't really matter.

Georgie's work ethic was good. She started off by sharing a playlist with me on SoundCloud so that I could get a better feel for her work, and I could then choose which of the tracks she gave me would suit my animation best. She then made multiple versions of the track I chose so that I could get a better feel for it, and again I picked the best one for her to develop further. This is what she came back with:


After listening to the track and creating the animation, I messaged her to ask if it would be possible to make the track only 2 minutes and 10 seconds long, because that's how long my animation is. Fortunately her response was yes, and she edited the track down to fit my specifications, but she also added more to the track to give it a bit more personality. This is what I got back from her:



Extended Practice: Press Pack Part 3

For the third part of the press pack we have to make a trailer for our animation, and I wanted mine to be dramatic but simple. To do this I made it so that only two or three seconds are shown at a time, with long pauses of blackness in between to build up the tension. I have only included the first few snippets of the animation so that I don't show too much off and give anything away.

I edited this together using Premiere Pro and made the short clips fade in and out of each other. I also placed all the credits at the end and none at the beginning, so that the name of the animation is revealed at the end of the trailer. There are some good bits I could have placed in the trailer, but I feel that if I had, I would have given too much of the animation away and spoilt the whole experience of the full film.

I didn't give much thought to the typography in the trailer; I just wanted a font that felt industrial and made to look automated. I felt this would add something to the whole feel of the trailer.


Wednesday, 10 May 2017

Extended Practice: Press Pack Part 2

As well as creating a bio and synopsis for my animation, I have also created a poster and four different stills from my animation to go in my press pack. The stills I have taken are key points and show the different stages of the transformation from the girl character to the robot character. I did this to give the stills an almost storyboard-like look, so that people know from looking at them what the animation will entail.


With my poster I wanted to go for a look that didn't really explain much about the animation but got people wondering what it's about. I didn't want to show the robot character in the poster, because I wanted the robot's reveal in the animation to have more impact. For the poster itself I wanted to keep it quite simple without too much showing, like the poster for X-Men: Days of Future Past, where it's just the actor's face with the film's title and that's it. So I decided mine would be the girl character on one side of the poster, walking into shot so you don't even see her face, with the title of the animation, 'Break Away', at the top. I also added a strapline, 'When Technology Becomes Reality', because it gives off the idea that the animation is about the growth of tech and how we're walking a thin line between CG and reality.

Tuesday, 9 May 2017

Extended Practice: Final Crit / Test Edit

For the final crit I had to show my animation to my peers at uni, so that they could get a better understanding of it and see how far I'd progressed with my project. However, at this stage I was still rendering off my animation, so I didn't really have any final work to show. So what I did was edit together a wireframe playblast of my animation, so that people could see how the animation would feel, what shots I was going to use, and how it fitted with the music that Georgie produced for me.

The test edit was just an updated version of the animatic, but with more animation, and it demonstrated how the particles were going to flow off the characters. It wasn't a final edit of what the animation would be.

During the final crit I didn't get much feedback on the test edit or what people thought of it. The only thing said was that the images I also showed were a bit dark; I explained that this was because I still had colour correction to do on them in post-production. From this I gather that most people liked what they saw, or they couldn't be bothered to give me any feedback.

Extended Practice: Creating my Bio and Synopsis / Press Pack Part 1

For this module we have to create a press pack where we have to promote our animations. Within this press pack we have to create a bio explaining ourselves and a synopsis of what our animation is about.

My Bio:

James Beardsell is a 3D animator and visual effects artist from Leeds, who is graduating from Leeds College of Art, where he studied BA (Hons) Animation. He creates 3D animations, sometimes with motion capture, particle simulations and dynamic effects. James also likes to experiment with different tools within the 3D software Maya to create elemental character effects, as well as pushing his abilities in different 3D packages and learning new techniques. If he's not working at his part-time job at Costa Coffee, he is researching new ways to create and develop visual effects across multiple packages.

My Synopsis:

Break Away is an animation which shows how, because of advancements in technology, people are mistaking CG for real life within films and animations. It is about a performance executed by a girl, only for her to be revealed as a robot midway through the dance. The animation was created using a live performer and motion capture, then transferring the motion capture data onto a 3D character, which then imitates the performer. Everything else was modelled in the 3D software Maya and rendered out using Arnold renderer.

Extended Practice: Rendering

Since my animation is 3D, I decided to render it on a render farm. I chose to do this because it would render my frames quicker than the computers at uni, though it would be very expensive. I wanted to go with a render farm called Rebus Farm; however, I found out quite late that because I'm using Arnold renderer, Rebus Farm couldn't render my frames, as they don't support it. So I quickly did some more research on render farms that support Arnold, and I found a couple: one was Rendernation and the other was Garage Farm. Rendernation looked expensive and I couldn't get a quick quote on how much it would cost to use their farm. That left Garage Farm: their website was simple to use, and they had a plug-in that lets you submit from within the 3D application you're using, so you don't have to upload to their website.


UNFORTUNATELY, when I tried to render my frames using Garage Farm, the frames came back with the characters transparent (see the image below), even though when I rendered on my own computer they were fine. This set me back in money, because it cost me about £500, and in time, because I have just under two weeks left to render 2880 frames.


So my plan is to use the computers in the AV computer room at uni as my main render farm, and then use my two big computers at home, which can churn out quite a few frames in minutes. I'll spread the frames across the computers and render 100 frames per computer, which will hopefully speed things along. I figure it will take around a week to render all my frames, and then a couple of days to do post-production and the press pack.
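The batch plan is simple enough to sketch in Python. The machine names are placeholders, but this is the kind of round-robin split I mean: chop the 2880 frames into 100-frame ranges and deal them out across whatever computers I can get hold of:

```python
def frame_batches(total_frames, batch_size=100, first_frame=1):
    """Split the frame range into consecutive (start, end) batches."""
    batches = []
    start = first_frame
    last = first_frame + total_frames - 1
    while start <= last:
        end = min(start + batch_size - 1, last)
        batches.append((start, end))
        start = end + 1
    return batches

# Hypothetical machine names: the AV room PCs plus my two home computers.
machines = ["av-room-01", "av-room-02", "av-room-03", "home-pc-1", "home-pc-2"]
batches = frame_batches(2880)
assignments = {m: [] for m in machines}
for i, batch in enumerate(batches):
    assignments[machines[i % len(machines)]].append(batch)

print(len(batches))  # 29 batches of up to 100 frames
```

Each machine just gets told its list of (start, end) ranges, which is exactly what I type into Maya's render settings on each computer.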

Monday, 8 May 2017

Extended Practice: Trees and Lighting for Emma

The other day Emma and I sat down together and worked out the lighting and the camera angles for some shots in her animation. I also modelled some trees and a forest for her within Maya; I made them low poly to keep with the 2D theme, using toon shaders on the trees.

Because Emma wanted the trees in her other world to be a different colour from normal trees, she had to experiment with colour palettes before I could start adding the shaders. So I modelled four different trees and then duplicated them to create the forest while Emma was sorting out the palettes. There were a couple of problems when duplicating the trees, due to how many there were: some of them started acting weird, like trees creating groups of themselves without me controlling them, and others were flagged as an unknown node, which I didn't really understand. However, when I closed Maya and reopened the scene it fixed itself, and I didn't have any more problems, so that's all good now.

Afterwards Emma explained where the camera would be placed in the environment, so that I could render a shot of the background and send it to her. The lighting wasn't too hard to do because we decided to do most of it in post-production, so I just added a directional light so the scene didn't look dull. I had to experiment with the sampling on the light because it was giving off shadows that were too sharp, which made the scene look like a very sunny day, which we didn't want.

Doing all of the above, we managed to get about half of the backgrounds finished and rendered. Next time it won't take all day: now that we know what to do, it won't take long to sort out the scenes. Below is an image of one of the backgrounds we created and rendered, where the character is in the forest.

Thursday, 4 May 2017

Extended Practice: More of Modelling for Emma

More of Emma's work now: I've modelled more things, like a bridge, a bench and a lamp post. With these three I tried to keep them as simple and low poly as possible, so that texturing them will be easier. The bridge was made to look old, wonky and run down, but I will be adding more detail, like rope between the railing and the footpath, and I'll probably make some of the planks on the footpath look broken.

The lamp post and the bench were easy to model; I'll be placing them in a park environment with a footpath in front of both of them. I still need to build the park environment, which entails modelling trees and grass, but hopefully this won't take too long. I'm modelling in a low-poly aesthetic because I feel it will fit Emma's animation appropriately, plus the toon texturing on the models will help the 3D environment blend in with the 2D characters.

At the minute I haven't fully textured the models, because I'm still waiting on Emma to give me the colour sheets for the toon shaders. We also need to sit down and talk about where the objects will be placed in the environments, like the park and the forest. I have, however, been able to texture the cabin fully, because Emma has sorted out its colour sheets; the cabin has a two-shade toon shader on it, giving it that cartoon 2D aesthetic. The lighting on it isn't correct yet, because I need to build a little village using the cabin; the light on it now is just an indication of how the village cabins will look in the shot.

Extended Practice: More Motion Capture with OptiTrack System

In this mocap session Katy, Mat (my tutor) and I went to Sheffield Hallam University to use their motion capture system, OptiTrack. This is where the subject wears a suit with reflective markers and performs while 12 or so cameras track their movements and place the data onto a skeleton character, so you can watch the performance in real time.

I did this because I wanted to see how much easier it would have been if I had used the OptiTrack system rather than the iPi motion capture software. My answer is yes, it would have been easier; however, due to unfortunate events I was unable to use the system to capture my main performance. The whole experience was so much fun, and I'd like to experiment more with the system and produce other animations using it. I loved performing as well, even though the OptiTrack system couldn't track me in my suit for some unknown reason.

This experience was useful though, as I got to try new software and delve into the world of motion capture. It has made me realise that I'd love to go into this field in the industry, where I can dance about and create realistic motion in animation rather than keyframing, because motion capture is more fun.




Extended Practice: Creating the Particles

The particles were the fun part of the animation; this is where I could experiment with different effects and make the particles look dramatic as they flowed off the character. The particles were emitted from the beam texture, so that they could follow the transition from the girl to the robot.

Due to the scale of the scene, the particles had to be sized up so that the audience could see them. I did this because if I added more particles, my computer couldn't handle the physics and the particle count; I already have 5,000,000 particles in the scene, so any more would crash the computers. I am also unable to use motion blur on the particles, because it would increase the render times: the renderer has to scrub through the timeline a couple of times before rendering each frame, and for my animation that takes a long time.

To make the particles actually fall, I added a gravity field to the scene and made the floor of the studio a passive collider, which stops the particles from falling through the floor. When I did this, the particles fell in a linear fashion, which wasn't appealing, so, because I didn't want to stress the computers more, I got rid of the gravity field. To make the particles fall with style, I placed a volume axis field and manipulated it so that the particles were pushed downwards. I chose the volume axis field because it lets you add turbulence, which makes the fall of the particles more dramatic and interesting.
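To show what I mean about linear versus turbulent falls, here's a toy version in plain Python (nothing like Maya's actual solver, and the numbers are arbitrary): a constant downward push moves every particle identically, while a little random sideways jitter breaks the fall up:

```python
import random

def fall(positions, steps, push=-0.5, turbulence=0.0, seed=1, floor=0.0):
    """Step simple (x, y) particles downward, with optional sideways jitter."""
    rng = random.Random(seed)
    out = []
    for x, y in positions:
        for _ in range(steps):
            x += rng.uniform(-turbulence, turbulence)  # turbulence-style jitter
            y = max(floor, y + push)  # crude passive-collider floor at y = 0
        out.append((x, y))
    return out

start = [(0.0, 10.0), (1.0, 10.0), (2.0, 10.0)]
plain = fall(start, steps=30)                 # every particle falls identically
turbulent = fall(start, steps=30, turbulence=0.3)  # x positions drift apart

print([round(y, 1) for _, y in plain])  # [0.0, 0.0, 0.0] (all resting on the floor)
```

With no turbulence the x values never change, which is the boring linear fall I had with plain gravity; with jitter the particles spread out, which is roughly what the volume axis field's turbulence gave me.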





Extended Practice: Positioning Cameras

At first I wanted the animation to be one big shot where the camera just moved around the performer / characters, because I wanted a smooth, flowing aesthetic. However, doing this would mean longer render times, which I don't have, plus it would be more complicated when it came to distributing the frames between computers, due to the particles not flowing exactly the same.

So what I ended up doing was placing two cameras at either end of the dance studio, facing 180 degrees from each other. This way I can capture the performance from different angles while keeping the render times down. It will also help when editing the animation in post-production, because I'll have more control over what's shown in each shot and can choose which camera has the better angle.

I also chose two cameras because while the characters are dancing, the motion capture data gets a tad jumpy, making the performance jumpy too. To fix this, I cut from one camera to the other when the jumpiness takes place; doing this hides the small glitches in the performance.
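I found the glitchy frames by scrubbing through by eye, but the same "cut on the glitch" decision could be sketched as code: measure how far a joint moves between consecutive frames and flag frames where the jump is much bigger than normal as places to switch cameras. The positions and threshold below are invented for the example:

```python
def jumpy_frames(positions, threshold):
    """Return frame indices where frame-to-frame movement exceeds threshold."""
    flagged = []
    for i in range(1, len(positions)):
        delta = abs(positions[i] - positions[i - 1])
        if delta > threshold:
            flagged.append(i)
    return flagged

# Mostly smooth motion with one mocap glitch between frames 3 and 5.
hip_height = [1.00, 1.02, 1.03, 1.05, 1.55, 1.06, 1.07]
print(jumpy_frames(hip_height, threshold=0.2))  # [4, 5]
```

A camera cut placed on those flagged frames hides the pop, which is exactly what I was doing manually in the edit.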

Looking back at my animatic, the camera position was static, but that wasn't my main idea back then: from the start I wanted the camera to be constantly moving throughout the animation. But representing this in the animatic would have made the performance look a bit all over the place, and people wouldn't have understood it. This is why for the animatic I kept the camera in one position and focused on the particles breaking away from the characters.

Extended Practice: Testing Textures on characters

After creating the textures, I tried them out on the characters in the scene; however, I did make a couple of mistakes. I made some of the image sequences for the arm in the wrong order, so the arms were disappearing from the top of the shoulder rather than from the hand upwards. I must have made this mistake about two more times after that, because the UV maps on the robot and girl characters weren't very clear to read, hence why I kept making the mistakes.

But after I got everything fixed, the transition from girl to robot looked quite fluid and in time with one another. I also had a bit of an issue where the robot character was appearing before the girl character had disappeared, but I just had to change the timings of the textures in After Effects.
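The wrong-order arm mistake boiled down to the image sequence playing backwards, and the fix is just reversing the frame numbering so frame 1 becomes frame N and so on. A sketch of that renaming plan (the file names here are hypothetical):

```python
def reversed_sequence(names):
    """Map each sequence file to the name it should be renamed to,
    so the sequence plays in the opposite order."""
    ordered = sorted(names)
    return {old: new for old, new in zip(ordered, reversed(ordered))}

frames = ["arm_mask.0001.png", "arm_mask.0002.png", "arm_mask.0003.png"]
print(reversed_sequence(frames))
# maps 0001 -> 0003, 0002 -> 0002, 0003 -> 0001
```

In practice I re-exported the sequence from After Effects rather than renaming files, but the principle is the same: the dissolve direction is nothing more than the playback order of the masks.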

Now all I need to do is add particles onto the characters and see how the transition looks between the girl and the robot. I'll be adding the particles with the beam textures I've created, where the edges of the white beam are noisy, deforming the outline of the beam and emitting the particles irregularly and randomly to make the break away more realistic and believable.



Saturday, 22 April 2017

Extended Practice: Creating Textures

For the particles to emit from the girl character, I had to create a texture with a white beam running across the UVs. I also needed to create two more textures: one to make the girl character disappear, and the other to make the robot character appear. These two textures are basically white and black blocks covering the UV maps of the characters.

For the emission beam texture I had to create multiple beams that ran across the UV map of the girl character. I created this in After Effects using the beam effect, and I used levels to sharpen the beam. I also added in a noise effect called 'fractal noise'; this made the edges of the beam look rough, so that when the particles are emitted from the beam they don't emit in unison.


Moving on to the disappearing effect for the girl character: I did this by covering up the UV map with black boxes, because black translates as transparent and so makes the girl character disappear. To do this I used the masking tool in After Effects and went over the UVs with masks to reveal the black box hiding them.


Finally, for the appearing effect of the robot character, I did the same thing as above, but with the colours swapped round: instead of black boxes appearing over the UV map, it was white boxes. This way the robot could appear as the girl character disappeared.
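Because the appear texture is just the disappear texture with the colours swapped, you can think of it as inverting a black-and-white mask: white (visible) becomes black (transparent) and vice versa. A tiny sketch on a made-up 1-bit "texture":

```python
def invert_mask(mask):
    """Invert a black(0)/white(1) mask: the robot appears exactly
    where the girl has disappeared."""
    return [[1 - px for px in row] for row in mask]

girl_disappear = [
    [0, 0, 1],  # 1 = still visible, 0 = dissolved away
    [0, 1, 1],
]
robot_appear = invert_mask(girl_disappear)
print(robot_appear)  # [[1, 1, 0], [1, 0, 0]]
```

This is why the two sequences stay perfectly in sync when the timing is right: they're complements of the same animation, not two separate ones.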



 

Extended Practice: Cleaning Up The Animation

At first I thought cleaning up the animation would be a simple but long task. It turned out that, with all the information in the scene and the 2880 frames in the animation, it was too much for my computer to handle, and it kept crashing when the RAM hit full capacity. This meant I had to do the clean-up in stages so that my computer could handle the workload.

Cleaning up the animation was still simple to do, because all I had to do was move keyframes around so that arms and legs weren't merging into the body. And as I said in a previous post, the hardest bit was when the characters went into the floor and I had to lift them up frame by frame, which was a bit tedious to work with.
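Lifting the characters out of the floor frame by frame is basically clamping the vertical channel of the keyframes. A minimal sketch, assuming the floor sits at y = 0:

```python
def clamp_to_floor(y_keys, floor=0.0):
    """Raise any keyframe value that dips below the floor up to floor level."""
    return [max(y, floor) for y in y_keys]

hip_y = [0.9, 0.4, -0.1, -0.3, 0.2]  # negative values = through the floor
print(clamp_to_floor(hip_y))  # [0.9, 0.4, 0.0, 0.0, 0.2]
```

In Maya I did this by hand in the graph editor; a real clean-up would also smooth the neighbouring keys rather than just flattening the curve where it clips, otherwise the motion looks stuck.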

Before I could start cleaning up the movements, though, I had to make sure the two characters were in the same position throughout the animation, because if not, you'd be able to tell there is another character underneath the girl character. This meant I had to scrub through the timeline, find any parts where the two characters were not together, place them together, and keyframe it so that they'd stay together.

There is one thing I need to clean up every time I open the project, and that is the wall in the background with the windows on it. The brick texture on that wall, for some reason, won't map correctly, so whenever I open the scene the texture remaps itself and I have to correct it again. I don't know what's going on with it, so I'm going to speak to one of my tutors and get their help fixing the problem.

Extended Practice: Mocap Data on Characters

Now that I had all the data sorted and everything placed together, it was time to start applying the data to the characters. This was simple enough to do using HumanIK in Maya: I created character definitions, meaning I went through the skeletons of the characters and assigned each joint to the matching joint in the HumanIK window. This lets Maya know how to mirror the motion capture data onto each character's skeleton.
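The character definition step is essentially building a mapping from my skeleton's joint names to the standard slots HumanIK expects. As a plain-Python sketch (the joint names and slot list are simplified and hypothetical), the useful check is whether any slot is still unassigned before you bake:

```python
# A cut-down list of retargeting slots, standing in for HumanIK's full set.
HIK_SLOTS = ["Hips", "Spine", "Head", "LeftArm", "RightArm", "LeftLeg", "RightLeg"]

def unassigned_slots(mapping):
    """Return the retargeting slots that still have no joint assigned."""
    return [slot for slot in HIK_SLOTS if not mapping.get(slot)]

girl_definition = {
    "Hips": "girl_pelvis_jnt",
    "Spine": "girl_spine01_jnt",
    "Head": "girl_head_jnt",
    "LeftArm": "girl_l_upperarm_jnt",
    "RightArm": "girl_r_upperarm_jnt",
    "LeftLeg": "girl_l_thigh_jnt",
}
print(unassigned_slots(girl_definition))  # ['RightLeg']
```

Maya's characterize step effectively does this validation for you, which is why an incomplete definition refuses to lock: a missing slot means that limb simply wouldn't receive any mocap data.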

As you can see from the characters in the video below, some of the arms and legs go into one another, which isn't good. This means I'll have to go into the graph editor manually and move some keyframes around so that this doesn't happen. It shouldn't take too long, but the part where the characters go through the floor might take some time to clean up, because I'll have to reposition the characters so that they're not going through the floor.

Overall, I think the motion capture data has transferred well onto the characters, and most of the movements are smooth. There are obviously some amendments to deal with, like adding head and hand movements into the animation; if I don't, it will make the whole thing off-putting to watch and distract viewers from what is really meant to be happening. The hands will be simple enough to animate: all I'll have to do is place a controller on the hands that manipulates the fingers to bend, and for the head I'll just keyframe some movements so that it isn't fixed in one position.


Extended Practice: Bringing Things Together

At this point I wanted to see what everything would look like together, so I opened up the background scene and placed the characters in the studio. This showed me how the characters would look in the studio and how the lighting would work with them. As you can see from the pictures below, the character looks quite dark in the studio, but that is the idea: I wanted the environment to be dark so that it would look more eerie when she starts to transform into the robot character.

I've done some render timing tests, and at the minute, on my machine at home, it takes just under two minutes to render out the images you see here. However, this will change when the particles enter the scene, so I estimate another one to three minutes on top of the two it already takes. Since I'm using a render farm instead of rendering the whole thing myself, this will cost quite a bit of money; I'm talking two to five hundred pounds to get it rendered in time for submission.
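The back-of-the-envelope maths behind that worry: 2880 frames at roughly 2 minutes each now, plus an assumed 1 to 3 extra minutes per frame once the particles are in. These are my rough timings, not anything from the farm's pricing:

```python
frames = 2880
base_minutes = 2          # measured: just under 2 min/frame without particles
extra_low, extra_high = 1, 3  # my guess at the added cost of the particles

low_hours = frames * (base_minutes + extra_low) / 60
high_hours = frames * (base_minutes + extra_high) / 60
print(low_hours, high_hours)  # 144.0 240.0 hours on a single machine
```

That's six to ten solid days of rendering on one computer, which is why paying a farm (or spreading the frames across every machine I can find) isn't optional.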



Extended Practice: Final Mocap Recording

After my attempt at recording the performance in a studio, I decided I'd just have to record it at uni. The room at uni was much better at picking up the background and the dancer, plus it gave the dancer a better understanding of where she could dance so that the cameras would catch the performance.

I started off by recording Katy, who wasn't my main dancer, because the main dancer was busy that day. She tried her best to learn and perform the dance that the other dancer and I had put together; however, she was stressing out about not getting the dance right and not wanting to ruin my project. It was getting too much for Katy, so I called up the main dancer, and she agreed to come down and perform one last time for me, because she wasn't going out until the afternoon and it was still the morning.

The main dancer finally arrived and we were able to begin recording the performance. She only had to perform the dance twice: once to go over the choreography and warm up, and the second time she performed it so well that we didn't need another take. All in all the main dancer was only present for about half an hour and then we were finished. We were both happy with the final performance, so all that was left to do was bake the motion capture data onto a character to see how it turned out.


Overall the final result of the capture is good, but there will be some amending of the performance so that the character doesn't have any limbs intersecting the body. Also, at the end the character walks around and heads towards the back; at this point the cameras didn't capture the dancer's legs, so she looks like she is floating. I'll have to fix that in Maya using the keyframes and the graph editor.

Extended Practice: The Transformation

The transformation is an experiment to show me, and others, how the change between characters will look. It will also give me a good idea of how I'm going to make the particles fall off the character. This will happen in order: the arms will change first, then the legs, followed by the body and head.

Below you will see my experiment in motion, showing how the particles fall off and in what order. I did this using image sequences on the textures, where I used two colours, white and black. White was used to show the robot character appearing, and black was used as the transparency so that the girl character could disappear. I also used another texture where a white beam moves across the black UV map of the character, so the particles can emit from the moving white beam, which gives off the effect that the character is dissolving.
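The idea behind those beam textures can be sketched in plain Python. This is just the masking logic, not the Maya setup, and the texture width and frame counts are made up: one mask reveals the robot behind the beam, and a narrow band at the beam's front is where the particles emit.

```python
# Sketch of the sweeping-beam masks: white (255) = robot revealed / emit here,
# black (0) = girl still visible. Each call returns one row of the texture.
def reveal_mask(frame, total_frames, width=8):
    edge = int(width * frame / total_frames)   # beam front sweeps left to right
    return [255 if x <= edge else 0 for x in range(width)]

def emit_mask(frame, total_frames, width=8, band=1):
    """Narrow white band at the beam front; particles emit only from here."""
    edge = int(width * frame / total_frames)
    return [255 if edge - band < x <= edge else 0 for x in range(width)]

for f in range(5):                             # a five-frame mini sequence
    print(reveal_mask(f, 4), emit_mask(f, 4))
```

In the actual project these rows would be baked out as an animated image sequence and plugged into the transparency and emission textures, but the same sweep-and-band logic is what drives the dissolve.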

This experiment has shown me that I need to work out the layout of the UV mapping on the characters, because as you can see the transition between the two characters isn't as smooth as I'd like. It has also shown me that the look of the particles flying off the character is convincing, although a lot more tweaking of the particle attributes will be needed to make the transition more effective.

Tuesday, 28 March 2017

Extended Practice: IPI Software Mishap With Different Studio

Due to the fact that the IPI software can only capture a performance 2-4 metres away from the camera, the performer had a very small space to dance in. So I thought it would be better for the dancer and me to go to a studio that had a big space and mark out the performance area, so that she had a better idea of the space she had to perform in.

For the IPI software to work, the flooring had to be non-reflective and the room had to have no natural lighting. So I booked a studio which had padded flooring and blinds on the windows to make sure there were no reflections and no natural lighting. However, two problems occurred that day. The first was that I could not, for the life of me, calibrate the software and let it know where the floor was and what space it had. I have no idea why the calibration wasn't working; I must have tried it about ten times and every attempt failed. The second problem was that there was still too much natural light coming into the room even with the blinds closed, which made it harder for the IPI software to work out the depth perception.

I didn't want to just give up, so I captured the performance without calibrating the software first, thinking I could do the calibration at Uni in the room I'd been using before. But when I calibrated at Uni I didn't take into consideration where the cameras had been in the studio, so when it came to baking the data onto the character I got two different data projections in the scene, because the cameras weren't in the same place as they were in the studio. The image below demonstrates the issue I was having:

As you can see, there are two different data projections in the scene, so I couldn't position the character into the data: the two projections were recorded from different positions, which left one set of data higher than the other.

This just means I'm going to have to capture the performance again at Uni and calibrate it properly. It also means I've wasted £84 on a studio that didn't work for me.
  

Extended Practice: IPI Second Test With Katy And I

After testing out the software for the first time, I thought it would be a good idea to do another test where I experimented with capturing fast motion. To do this I grabbed Katy, who is also a dancer, and got her to do a couple of dance moves for me, testing how fast you can kick a leg and still have the cameras capture that speed in good quality. I feel this little exercise went pretty well, and the IPI software was able to capture most of the performance.

I kept the character and the data separate when I exported the animation so that I could get a good understanding of how the two looked different and how much detail got captured from the performance. The only problems with the performance on the character are that some limbs go into the body, and the back looks broken at one point in the animation. The character in this video isn't one that I built; I got it from a piece of software called 'Make Human' as a reference for what a character would look like with the motion capture data on it.
  

I also made another test using myself as the performer, to experiment with the different angles and poses you could do in the space that the software could pick up. I also did this so that I could have a go at performing, because it looked fun to do. I noticed in my performance that the IPI software didn't do well at capturing feet: there is a part where I stand still and rotate my body left and right, but the software read that movement as if I was moving my whole body left and right, which made my feet move as if they were hovering.

Extended Practice: IPI First Test

This was the first time using this software and, to be honest, it was really fun to use. I got my tutor Annabeth to teach me how to use it and to perform for me. I realised very quickly that when using this software you only get a two-metre space from each camera to perform in, so I'll have to rethink the performance that I've choreographed with Sophie (the dancer). This isn't too much of a problem, but I will have to set time aside to go to a studio and re-choreograph the performance.

It doesn't take too long to set up the program and record. The only tedious part is baking the data from the performance onto the character, because you have to watch out for parts of the body not tracking properly. For example, the arm can get stuck inside the body and stay there while the data for that arm carries on moving around, so the arm doesn't bake properly.

There are a few things to look out for when using the IPI software: there are some poses you can't do because your arm or leg won't get captured; your fingers won't get captured, so your hand will just have straight fingers throughout the animation; and the motion of your head won't get captured unless you have a Wii remote or a PlayStation remote attached to your head.

Extended Practice: IPI Motion Capture Software

I'm at the stage to start production now, so this means it's motion capture time. There were two ways in which I could capture the performance. The first involved my tutor Mat and I going to Sheffield Hallam Uni and using their motion capture facilities, where you wear a suit with the dots on and have all the cameras around you capturing the performance. The second involved me having two Xbox Kinects and a laptop with the IPI motion capture software installed, with someone standing in the middle of the Kinects performing, and capturing it that way.
In the end I went with the second option, using the IPI software, because unfortunately the facilities at Sheffield Hallam Uni were fully booked until April, which would be too late for me to start capturing.

The IPI software works by having one or two Xbox Kinects, at a 90-degree angle to each other, pointed at a performer so it can capture the performance using infrared tracking and depth perception. It's an easy-to-use motion capture program which offers a 30-day free trial (which I used) so that you can have a taste of what it can produce. Using the IPI software wasn't too hard; the only thing I had difficulty with was calibrating the scene before capturing the performance. This was down to many things: the lighting wouldn't be right, the board used for calibration was either too small or too see-through, or the software just couldn't pick the board up properly.


     

Extended Practice: Robot Character Dancing

I've also produced a short dance with the robot character, using some mocap samples that come with Maya, to see how the particles would fall off the character. This animation was only done in low resolution because I was more bothered about the flow of the animation than the look; I also wanted to see how long it would take the computers to render it. It turned out to take about 16-20 hours to render the full animation, which is a bit worrying given that this animation is only 260 frames long and my final animation will be about 2880 frames long. The animation took this long to render because I added motion blur to the scene, so the timeline had to scrub through the animation three times before rendering each frame in order to work out how the particles were going to blur.
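To put a number on how worrying that is, here is a quick linear extrapolation from the test render. It assumes render time scales directly with frame count, which is optimistic once more particles are in the scene, so treat it as a floor rather than a prediction:

```python
# Extrapolate the 260-frame test render time to the full 2880-frame piece.
# Assumes time per frame stays constant, which motion blur and extra
# particles may well break.
def extrapolate_hours(test_hours, test_frames, final_frames):
    return test_hours * final_frames / test_frames

low = extrapolate_hours(16, 260, 2880)
high = extrapolate_hours(20, 260, 2880)
print(f"estimated full render: {low:.0f}-{high:.0f} machine-hours")
```

Numbers like that, well beyond what one home machine can do before the deadline, are exactly why the render farm came into the plan.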

Looking at this animation, though, I will change how the weights are painted on the robot character, because the mesh is looking a bit deformed in some poses. I will also change how the particles fall off the character, and maybe just have a gravity field in the scene so that when the character spins round the particles fly off in the direction of the spin.

Monday, 13 March 2017

Extended Practice: Dissolving The Girl Character

I realised that when I've been testing the dissolving effect I've been doing it on the robot character, even though in my animation it's the girl character, not the robot, that's going to dissolve. So I used the automatic mapping tool in Maya to unwrap the character's UV texture map, and then used a ramp shader moving along the UV map to create the disappearing effect and to act as the particle emitter.

This created a different disappearance effect on the character than the previous tests on the robot character. This time the character looked like she was disappearing in little blocks or pixels, which looked quite strange, but when I presented the animation to my peers they said the effect actually suited the aesthetic of my animation. The effect had something to do with the way the automatic mapping unwrapped the girl's UV texture map.


What I'm going to do is create another way of making the girl character disappear, so that she doesn't vanish in small square pixels, and see how it looks. If it looks better then obviously I'll use that method in my final project.

Extended Practice: Cave building

Building the cave felt like quite a task because it was my first time doing it and I didn't really know where to begin. I started by looking at tutorial videos on YouTube, but most of them were about building caves in Blender, not Maya; however, the technique was still the same. I built the cave from a cylinder object and deformed the inner side to give it that natural, non-uniform look, then added some stalagmites and crystals. I also textured the cave using bump maps and normal maps to give it a more realistic look. The whole process took me about one to two hours, so it didn't take too long, and this is what it looks like (sorry if it's a bit dark):


After showing this to Emma, she told me that it was too realistic and wouldn't match the style of her animation, which, looking at it now, is right: her animation is quite cartoony, so this would totally stand out if I put it in there. I'm glad I asked her to come and look at it so that she could tell me this now and not later, when I'd have spent more time making it look even more realistic.

So, starting again, I created simple flat shapes and modelled the cave out of them; this time I wasn't using a cylinder as a starting point. To create the cave I made four planes and placed them together to make a square, but I also deformed them a little so that the planes don't look straight and uniform. I still put stalagmites and crystals in the cave because that's what Emma wanted, but I also added a couple of rocks on the floor. For the water that was going to be in the cave, I just made placeholders to give Emma and me some reference, so that she knows where the water is when she starts animating. (Again, sorry if the image is dark.)


Emma much prefers this design to the previous one because it looks more cartoony and low poly, so it will fit in with her animation well. I used a toon shader to texture the cave and give it that two-tone colour look. I also added a mesh glow on the crystals so that we wouldn't have to create a glow in post-production; creating the glow was easy, I just went into the attribute editor and increased the glow in the special effects panel. One thing that will change about this cave is the colour: Emma wanted the colours to be grey and blue, and I only made the cave brown to see how the toon shader would look on it.
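The two-tone look from the toon shader boils down to snapping a smooth lighting falloff into flat bands. This isn't how Maya implements its toon shader internally, just a sketch of the principle in Python with made-up tone and threshold values:

```python
# Two-tone (cel) shading idea: quantise diffuse intensity to two flat tones
# instead of a smooth gradient. Threshold and tone values are arbitrary.
def toon_tone(intensity, threshold=0.5, dark=0.3, light=1.0):
    return light if intensity >= threshold else dark

# A smooth 0..1 falloff collapses into just two bands:
print([toon_tone(i / 10) for i in range(11)])
```

Adding more thresholds would give more bands, but for Emma's low-poly style the hard two-tone split is what sells the cartoony look.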

Sunday, 12 March 2017

Extended Practice: Animatic

Before creating an animatic you'll usually need a storyboard; however, with my animation there really isn't much story, it's just someone dancing and then disintegrating to reveal a robot in their place. So I thought I'd do an animatic first that shows the main stages of the transformation and what actions the characters will be doing, along with the soundtrack I got from Georgia, the music student I've been working with. The reason the animation doesn't really have a story is that I wanted to create something that demonstrated my abilities and would look good in my showreel/portfolio.

Within this animatic I've also placed the video I took of the dancer when we were rehearsing and choreographing the dance, so that people would get a better understanding of the movements and the performance in the animation. I've had feedback from people watching this animatic: they've said that they really like the beginning where the dancer walks in front of the camera, because it adds drama to the performance, and also that they understand how the girl character will transform into the robot character more clearly when they watch the animatic. Another suggestion was that it might be good to experiment with the camera angles and have the camera weaving in and out between the characters.



Extended Practice: Emma's Environments (Cabin)

Starting with the cabin, I tried to model it in an old, rundown 'cabin in the woods' kind of aesthetic, while also making it look toony. To make the cabin look toony I gave it a curved, exaggerated roof and wonky wooden panels. I wanted a lot of detail on the cabin to give it more character and make it look like it had a history. I did this by placing wooden stilts to hold up the roof, making the house stand on rocks and wooden beams, and making some of the wooden frames on the fence tilted and worn out. I still feel like expanding the detail by adding some stubs in between the steps and maybe placing a lamp on the side of the porch.


I also created some UV texture maps for Emma so that she could texture the cabin how she wanted, so that the style of the environment will match the style of the character. Below you will see two different texture maps; the first one is a map of the windows on the cabin. I feel it demonstrates the layout of the windows clearly, and when I showed this map to Emma she understood what was going on in it. I've separated the window texture map from the cabin texture map because they are two different meshes, plus the character needs to be seen through the windows.


The second UV texture map is (if you've guessed it) of the cabin, where I've tried to keep it simple with minimal separations of the mesh to keep the UV map flat. I feel the map is easy to follow, though some bits are easier to figure out than others. The big rectangle-like shape at the top left-hand corner is the roof; top right is the end bits of the roof at the back; below that is the back of the cabin; underneath that is the door rims; and just underneath that, to the left a bit, is the underside of the overhang on the roof. The rest I believe is pretty straightforward and you'll understand what they are.