Making of: SPEED!
So Jayne's out of town and Ascorbius takes the Vulture out to quickly run a couple of errands. He should probably have checked a few things first before tearing across the stars.
Speed was an interesting video to make, partly because it combined every trick I've learned in video editing, and partly because I love the idea that petty bureaucracy still exists in 3303. We see examples of this when trying to switch ships (see Jobsworth), or when requesting docking permission from slightly too far out, or when you've accidentally forgotten to obtain permission and have to hightail it back out - you need to reach a minimum distance before requesting again. In this video, Ascorbius has to deal with a speed camera, and later a parking clamp.
The writing for this episode didn't take too long. It was an idea I'd had for a while; I just needed to get it into a form that would work. After a couple of drafts I had something I thought would work.
The original location for the video was to be an asteroid belt - Ascorbius would be mining, then boost from his location on the way back to the station and get caught out by a speed camera on an asteroid - a ridiculous place to find one. The problem with this idea was that asteroids spin, which would have made the post-production process a lot more difficult. It needed to be something stationary.
So instead I went searching for locations. In 2.2, Frontier introduced a bunch of installations - assets from CQC re-used as in-game locations you can visit. While there's no gameplay associated with them at the time of writing, they look epic and are a lot of fun to fly around.
I chose a scientific installation near a lava planet - which is doubly cool. I had to visit a few of these locations to find just the right one that could plausibly have a speed camera. I also wanted it to look like you could land your ship inside (even though you currently can't - the landing scene was filmed at a nearby station).
The filming for the video was done using the new 2.3 beta, although I did take advantage of a bug in the beta, now (sadly) fixed, where the commander's head would disappear after using the Holo-Me screen. I don't actually use the Holo-Me system at all - I replace the head completely in post-production. More on that a little later.
There were a couple of shots which took a bit of setting up - the zoom-out from the cockpit view for one. I used the oldest trick in the book for that: a combination of scaling, masking and animating opacity.
Once filming was complete, it was time to assemble the footage into an initial raw movie - which meant recording the audio. I sent the script to Dean (Jayne) and Carl (S1Studios) for their parts, and recorded Ascorbius' audio as well as stand-ins for the other parts for timing. I'd approximate the other characters' voices by altering the pitch and adding effects in Mixcraft. This works great for getting the basic timing down, but I'd need the final audio from the other players before I could start the lip syncing.
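Mixcraft does the real processing, but the core idea behind a crude pitch shift is easy to sketch. The snippet below is a naive illustration, not what Mixcraft does internally (proper pitch shifters preserve duration); `pitch_shift` is a made-up helper that simply resamples the waveform, which raises the pitch and shortens the clip together.

```python
import math

def pitch_shift(samples, semitones):
    """Naively pitch-shift by resampling (this also changes duration).

    A ratio of 2**(semitones / 12) maps each output sample back to a
    linearly interpolated position in the input.
    """
    ratio = 2 ** (semitones / 12.0)
    out_len = int(len(samples) / ratio)
    out = []
    for i in range(out_len):
        pos = i * ratio
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# One second of a 440 Hz sine at 8 kHz; shifting up 12 semitones
# halves the sample count, which doubles the perceived frequency.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * n / rate) for n in range(rate)]
shifted = pitch_shift(tone, 12)
```

That duration change is exactly why this trick is only good enough for scratch tracks.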
Because we can't currently animate the mouths or the expressions of the Commanders within Elite: Dangerous, I have to make use of other tools to animate the characters. There are a few options available - for instance, in a recent video Vindicator Jones used a piece of software called "Crazy Talk" to move the mouth of a character. That works well, but I needed a lot more flexibility than that - I like to animate the expressions as well as lip sync to the audio. That said, the new version looks very powerful - I might need to give it another look.
Enter Daz Studio.
Daz Studio is a free Windows application which allows for the posing and animation of virtual actors and sets. It is very powerful and capable of rendering photo-realistic output - if you want to wait for the ray tracer to finish rendering. I didn't have time for that, so I opted for the Basic OpenGL renderer, which has some limitations.
By taking a still from the footage and using it as a background, I was able to create and align a camera in the scene and pose Ascorbius according to the picture. I do this for each of the camera angles. It doesn't actually matter if the video footage is perfectly still or not, the actor will be composited into the scene and camera matching will take care of any bobbing around.
For the actual render, I remove the background and render a sequence of PNG images. I use PNG files rather than JPEG so I get the alpha channel transparency.
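The reason the alpha channel matters is the compositing step: each rendered pixel carries its own opacity, and the compositor blends it over the background with the classic "over" operator. Here's a minimal per-pixel sketch (illustrative only - After Effects does this for you; `over` is a made-up helper):

```python
def over(fg, bg):
    """Composite one RGBA pixel over an opaque RGB background pixel.

    'fg' is (r, g, b, a) with all channels in 0..1; 'bg' is (r, g, b).
    The classic "over" operator: out = fg * a + bg * (1 - a).
    """
    r, g, b, a = fg
    return tuple(f * a + back * (1 - a) for f, back in zip((r, g, b), bg))

# A half-transparent red pixel over a blue background blends the two.
# A JPEG has no alpha channel, so it could not carry the 0.5 here -
# which is exactly why the render goes out as PNGs.
pixel = over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0))
```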
Once the scenes have been set, it's time to animate them - but first I needed the completed audio.
Sets and Scenes
There are some additional internal scenes in this video. Notably a bar and a corridor leading to the bar. These are assets from the Daz3D store which were purchased, assembled, re-textured and lit to make it look like a bar in a space station - while still keeping a traditional bar style.
Setting up this scene actually drove home a limitation of the OpenGL renderer: it can only handle 8 lights in a scene, and I had many more than that. Furthermore, some of the textures had a full-bright aspect to them - LED lights, screens and so on. The Basic OpenGL renderer can't handle these, so I had to use the Intermediate OpenGL renderer, which was fine up to a point - but as soon as Ascorbius was added to the scene, the render times increased dramatically to an unusable amount. So, as in real movie making - if at first you don't succeed, cheat!
I rendered the background scenes as stills using the intermediate renderer, rendered Ascorbius and the barman using the basic renderer, and put them together in After Effects.
After a few days, the audio came in from Dean and Carl, so I was able to put them into Mixcraft to get the timing down and prepare the final audio mix.
With the sets made, and the final audio complete, it was time to animate the lip sync.
To do this I used a plugin from the Daz Studio store called Mimic Live, which listens to the desktop audio (and any mic you have connected) and tries to mimic the mouth movements on the selected model. It's not perfect and has a few glitches, but for the most part it does a decent enough job.
Once the lip syncing is complete, Mimic Live also places the audio file it used within the scene, so you can play it back and animate expressions in time with the audio. With enough time and patience you can get quite a decent performance done - I had neither of those things.
The final animation was then rendered out as a sequence of PNG images from each camera angle so they can be imported into Adobe After Effects.
Compositing the Scene in After Effects
This image shows everything I use to build the scene in After Effects.
The X's in the scene are tracking points. These are generated by the 3D Camera Tracker and are used to calculate the movement of the camera so you can make a virtual camera which matches those movements. This is scary powerful and is used a lot in my videos.
I then add Ascorbius' footage for that particular scene and camera. I make the footage into a 3D object, so when the image moves around a bit (because it was recorded with the Vive), the head moves with it.
Then we have the problem of the neck. Ascorbius has a whole body, which we only need to help position him in the scene, so we mask out everything we don't need by adding a mask to the image. And because the image is tracking with the camera, the mask will too - brilliant!
For the music, a good friend of mine, Seokho Kim, allowed me to use some tracks from his Korean rock band "Mechanic"; typically, though, S1Studios provides me with sweet orchestral goodness.
For the speed camera section, I used a free speed camera model and placed it within the scene using Cinema 4D. This is something I'll use more and more in the future - it's amazingly powerful and integrates nicely with After Effects.
In the final scene, Ascorbius takes the Vulture even though it has a parking clamp and a damaged engine. For the parking clamp I looked high and low for a 3D model, but without any success. I could have built one myself using Blender, but time was pressing. So in the end I used Xara Designer (a vector graphics program I've used forever, similar to Adobe Illustrator but easier to use). I drew a parking clamp and used its 3D object feature to make it pseudo-3D.
If you watch the final scene carefully, the landing gear falls off and the parking clamp hits the screen. These were 2D objects animated in 3D within After Effects. I initially wanted a proper 3D model, but I think it turned out really well.
For the flames coming from the back of the Vulture and for the smoke in the hangar, I used a program called TimelineFX, which I typically use for game effects - but it does a really nice job with particle fire. I rendered the effects out as a sequence of images and applied them the same way as the Daz Studio character.
There you go, easy peasy :)
If you like these videos, please remember to like, subscribe and leave a comment - it all really helps the channel and helps me make more videos.
Have a good time, and Fly Aimless.
Special thanks to the awesome guys at the Josh Hawkins Brobar for their support. You guys are awesome.
This video was dedicated to my old friend and best man at my wedding Warren Sparks who sadly passed away this year after a long illness. His Eve Online character Maud was the inspiration behind Maud's bar.