
Making Videos: The tale of many tools!


I've been making Elite Dangerous videos for about 18 months now, with the hope of bringing a little joy into people's lives. Early videos employed mostly in-game footage, but as time went on I found myself wanting to make comedy videos instead of just recording play sessions, which meant game footage alone wasn't going to be enough.

This meant finding ways to do things with the camera system which were not available in the game. Most notably, I needed the ability to have physical acting, at least in some capacity.

Since I started making videos, the content creation tools available within the game have improved dramatically. I still have fond memories of the original debug camera system, but in May of 2017 everything changed: Elite Dangerous 2.3 was released, and we suddenly had access to a new and powerful camera system which allowed some very nice footage to be captured. We even got an avatar creator.

However, this still wasn't powerful enough. The avatars could not speak or act - unless by acting you mean aimlessly looking around the cockpit with a vacant expression, no matter the context, while their hands move about as if to relieve cramp.

E:D 2.3 even came with new problems. I record in VR, and the headless VR neck used to be stationary with a tall collar; now the neck and the clothes move as if a head were attached. On top of this, there were some bizarre new rules for camera distances, and a new and highly irritating blur effect would appear if the game determined something was blocking the view to the ship.

(If anyone from Frontier is reading this, please make the occlusion blur effect go away... it serves no purpose and has killed more great shots than any other factor.)

Anyway, I digress.

Over the last 12 months, I've used 3 techniques to create talking characters, each using some form of 3D rendering.

Method No1

This method used Daz Studio alone for animation and rendering: creating long sequences of PNG images, overlaying them onto the in-game footage, and masking out any parts of the commander's suit which needed to obscure the neck. This was the technique used in all videos from The Villainous up to the Mining Class video.
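
As an illustration of the overlay step - not the exact After Effects workflow I used, just a minimal Python/OpenCV sketch assuming the rendered PNGs carry an alpha channel and match the footage's resolution and frame count (all filenames are placeholders):

```python
# Minimal sketch: composite a rendered PNG sequence (with alpha) over footage.
import cv2
import numpy as np

cap = cv2.VideoCapture("game_footage.mp4")            # placeholder source clip
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("composite.avi",
                      cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Rendered frame from Daz, e.g. render/head_00000.png, with alpha channel.
    overlay = cv2.imread(f"render/head_{frame_idx:05d}.png", cv2.IMREAD_UNCHANGED)
    if overlay is not None and overlay.shape[2] == 4:
        alpha = overlay[:, :, 3:4].astype(np.float32) / 255.0
        frame = (overlay[:, :, :3] * alpha
                 + frame * (1.0 - alpha)).astype(np.uint8)
    out.write(frame)
    frame_idx += 1

cap.release()
out.release()
```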

This method, while perfectly functional, suffered from long render times when using the more powerful Iray ray tracer, and from a distinct lack of quality with the basic OpenGL renderer. It was further hampered by a lack of automation when it came to rendering multiple camera angles: each angle had to be rendered manually, one at a time.

It also suffered from the limited number of lights available with OpenGL - only 8. Because of this, the video Speed had to be made using several rendered layers. I wrote an article about that if you're interested.

Method No2

Following that technique, I decided that I needed to remove the Daz renderer from the process; the camera and render limitations were creating far too much work.

After a lot of searching, I discovered Element3D: a powerful 3D rendering plugin for After Effects.

I could use the powerful 3D engine it provided directly within After Effects, and use it to render a sequence of 3D meshes which had been exported from Daz Studio after lip syncing and animation had been completed.

This gave me a lot of control over the camera and lighting, but the export process from Daz meant creating a ridiculous number of meshes (an OBJ sequence), which were enormous in terms of disk footprint.

To combat this problem, which was causing both extreme render times and heavy disk usage, I needed to decimate the meshes prior to export to reduce the file size.
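
If you need a decimation step of your own, here's a minimal sketch of one way to do it - batch-decimating an OBJ sequence with Blender's Python API (an assumption on my part; this isn't the tool I used, and it relies on Blender 2.8x-3.x, where the legacy OBJ operators still exist):

```python
# Sketch: batch-decimate an OBJ sequence with Blender's Python API.
# Run headless:  blender --background --python decimate_sequence.py
# Folder names and the 0.3 ratio are placeholders.
import glob
import os
import bpy

SRC = "obj_sequence"
DST = "obj_decimated"
os.makedirs(DST, exist_ok=True)

for path in sorted(glob.glob(os.path.join(SRC, "*.obj"))):
    bpy.ops.wm.read_factory_settings(use_empty=True)  # fresh, empty scene
    bpy.ops.import_scene.obj(filepath=path)
    for obj in bpy.context.scene.objects:
        if obj.type == 'MESH':
            mod = obj.modifiers.new("decimate", 'DECIMATE')
            mod.ratio = 0.3                           # keep ~30% of the faces
    # The exporter applies modifiers by default (use_mesh_modifiers=True).
    bpy.ops.export_scene.obj(filepath=os.path.join(DST, os.path.basename(path)))
```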

Then I needed to set the textures of the meshes within Element3D, position the camera, and track the footage within After Effects to match the camera movements. Quite time consuming.

After initially great results, it became apparent that the export process was flawed: it would occasionally drop frames, which led to out-of-sync footage requiring constant correction on the timeline. The renderer was very nice, though.

Each of these tools, whilst powerful, had a similar limitation: they were very expensive in terms of time.

Time is a very precious commodity when it comes to video production. The Scientific Method, the first video made using the Element3D plugin, took around 3 months to make. I made 3 videos using the plugin, not to mention a part for the Christmas finale, but they all suffered from the loss of synchronisation, which meant the lip syncing would fail from time to time.

There was a further problem: because the animation had to be baked, any change to the animation required a complete re-export, which brought with it the risk of yet more frames being dropped.

If I was going to continue the series, I needed another way.

I needed a tool which I could animate with easily, which had good lip syncing capabilities, and which would allow me to render footage to overlay onto the Elite Dangerous footage, all while keeping wasted time to a minimum.

Method No3

I happened upon a product called iClone. It promised exactly the features I needed. It had a built-in renderer which was capable of some incredible results, and it was real-time, which meant I could spend my time improving the animation.

It had a fantastic lip syncing tool too: you give it an audio file and it generates animation by placing visemes (mouth shapes for a given sound) on the timeline, for you to alter manually afterwards if needed.

(to be fair, I have to manually alter every single viseme)

But I can now get a level of accuracy which was previously unavailable.
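
To make the idea concrete, here's a toy sketch of what a viseme track boils down to - purely illustrative, and nothing like iClone's actual data format; the mouth-shape names and the tiny phoneme map are invented for the example:

```python
# Toy illustration of a viseme track: (time, mouth shape) pairs derived
# from phonemes. Not iClone's format - just the underlying idea.
PHONEME_TO_VISEME = {
    "AA": "Open", "AE": "Open", "AH": "Open",
    "B": "M_B_P", "M": "M_B_P", "P": "M_B_P",
    "F": "F_V",  "V": "F_V",
    "IY": "EE",  "OW": "Oh",   "UW": "W_OO",
}

def build_viseme_track(phonemes):
    """phonemes: list of (start_seconds, phoneme) pairs, e.g. from an aligner."""
    return [(t, PHONEME_TO_VISEME.get(p, "Rest")) for t, p in phonemes]

# The word "bob": B at 0.00s, AA at 0.08s, B at 0.20s.
print(build_viseme_track([(0.00, "B"), (0.08, "AA"), (0.20, "B")]))
# -> [(0.0, 'M_B_P'), (0.08, 'Open'), (0.2, 'M_B_P')]
```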

It also has a good camera system, and cameras can be switched on the timeline, meaning entire sections of footage can be rendered without even touching After Effects.

All these great things seemed too good to be true...

and, as with all things too good to be true, they often are.

The renderer doesn't support as many shaders as Element3D, so holograms are not as easy to make.

There was one pretty major drawback: there's no simple way to drive a moving camera from camera tracking - something which I had come to rely upon for all the previous methods.

Apparently it is possible to export camera motion from After Effects with a plugin and import it into iClone.

It's also a fairly closed system, in that it doesn't natively support anything apart from its own internal file formats.

Instead, geometry from other applications needs to be imported via a bolt-on program called 3DXchange - which of course is not free; neither is iClone.

It does mean, however, that I can export rigged and morphed characters from Daz Studio into 3DXchange, fix up any expression and texture issues, and then import them into iClone and save them in the library.

Once a character has been converted, they can then be dropped into any scene to be animated.

Animation

Speaking of animating, there are some cool toys to play with within iClone. There's the face puppet, the motion puppet, direct keyframe animation, inverse kinematics, and a plugin (which has to be purchased separately) which reads from a Microsoft Kinect in real time to provide motion capture.

In reality, the animations are a little floppy, and it has a difficult time with anything complicated - so turning on the spot is hard - but it's good for acting expression. The animations can be tweaked, though, and you can anchor limbs to keep things like feet and hands in place.

To make more sweeping changes to animations you have to get an additional plug-in (again, for a fee), but then you can modify animation curves and so on.

Animating a scene in a wholly CGI environment is relatively straightforward once the animation concepts have been worked out: you animate, then you render, and the scene is complete.

For my videos, there are scenes where I have to overlay animated CGI heads onto game footage. This sounds like a simple task, but in fact it's the most time-consuming part of an episode. Full CGI scenes are actually easier.

There are three ways to approach it: use a static image taken from the footage as the background; embed the video itself as the background, for more complex scenes where the lighting or even the camera position changes; or, finally, use a green background.

To use video as a background, the footage first needs to be exported from After Effects in a lossless format (iClone has a difficult time with compressed footage - it skips, splutters and loses sync - so be prepared to lose some hard drive space for a while).
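
If you prefer to transcode outside After Effects, a lossless intermediate can also be produced with ffmpeg - a sketch, assuming ffmpeg is installed and on your PATH; the filenames are placeholders, and whether iClone accepts a particular container and codec is worth verifying first:

```python
# Sketch: transcode footage to a lossless intermediate with ffmpeg.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "exported_from_ae.mov",   # placeholder source clip
    "-c:v", "ffv1",                 # FFV1: a lossless video codec
    "-an",                          # drop audio; only the picture is needed
    "background_plate.avi",
], check=True)
```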

If the camera position in the footage moves too much... you have more work ahead. I believe it is possible to export the camera position from tracked footage via a script in After Effects; I've not done this myself yet, but it might be very useful.

To get your characters looking like they belong in the scene, you need to get the camera and lights set correctly.

Line up your camera or your subjects in the scene.

It's always best to move the camera so that the subjects are in the proper locations, and at the proper size, for the background image. This means the characters can move about properly, and as long as the field of view is the same as the in-game camera's, they will move around the scene as if they're really there.
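
Matching the field of view is just trigonometry: if you know a camera's focal length and sensor (film-back) width, you can compute the horizontal FOV to dial into the other camera. A quick sketch, with placeholder numbers:

```python
# Horizontal field of view from focal length and sensor width.
# 36mm is the standard full-frame film-back width; the 24mm focal
# length is a placeholder - use whatever your virtual camera reports.
import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm=36.0):
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

print(f"{horizontal_fov_degrees(24.0):.1f} degrees")  # ~73.7 degrees at 24mm
```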

Position lights so that the character is properly lit.

This is vital. You need to look at the lighting in the background plate to see where the light is coming from, what colour it is, how strong it is, and whether it needs to be animated over time to match the scene.

Getting the lighting right will sell a scene more than any other factor.
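
One habit that helps take the guesswork out of light colour - my own approach, not an iClone feature - is to sample the background plate rather than eyeball it, e.g. averaging a patch of a lit, roughly neutral surface:

```python
# Sketch: estimate light tint by averaging a patch of the background plate.
# The filename and patch coordinates are placeholders.
import cv2

plate = cv2.imread("background_plate_frame.png")
patch = plate[200:260, 400:460]               # a lit, roughly neutral area
b, g, r = patch.reshape(-1, 3).mean(axis=0)   # OpenCV stores BGR
print(f"average patch colour (R, G, B): ({r:.0f}, {g:.0f}, {b:.0f})")
# Scale these values into your key light's colour picker.
```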

iClone has several types of lights - directional, spot and point - and it also has shadow casters.

Once you've got the animation you need, plus the lights and the cameras, you are almost ready to render.

Depending on your compositing needs, you may now need to remove the background image or video and set a green background (or blue, if green is used on the character). This is so that you can composite the scene later within After Effects.

Set the render settings to the correct frame rate and resolution, add any other effects you want, such as ambient occlusion, and render the video.

Import and Compositing

Once you have your finished video, import it into After Effects, overlay it onto the footage in your comp, and apply a chroma key plugin (I use Keylight). This removes the green screen and gives your video a transparent background. You may need to tweak the key to reduce any artefacts from the lighting system in iClone, but the end result should be a nice, clean transparent background, with your actors looking like they're supposed to be in the scene.
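
Under the hood, a chroma key is just an alpha matte built from colour distance. Here's a toy version of the principle - nowhere near Keylight's quality, which adds spill suppression and careful edge work, but it shows what's happening:

```python
# Toy chroma key: build an alpha mask from "how green is this pixel?".
import cv2
import numpy as np

frame = cv2.imread("iclone_render_frame.png")    # placeholder filename
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Hue ~60 is green in OpenCV's 0-179 hue range; widen or narrow to taste.
lower = np.array([45, 80, 80])
upper = np.array([75, 255, 255])
green = cv2.inRange(hsv, lower, upper)           # 255 where the pixel is green
alpha = cv2.bitwise_not(green)                   # keep everything that isn't
alpha = cv2.GaussianBlur(alpha, (5, 5), 0)       # soften the matte edge

rgba = cv2.cvtColor(frame, cv2.COLOR_BGR2BGRA)
rgba[:, :, 3] = alpha
cv2.imwrite("keyed_frame.png", rgba)
```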

Remember how I said the necks move even with no head in VR?

This is a major problem to solve.

It's almost impossible and often impractical to try to match the movements of the neck with the animations of your CGI head. So to solve this you need to animate a mask.

Or you can use a still image - which is fine for short sequences, but you lose the ambient animations that bring a little life to the character.

So this means you need to:

1. Track the neck. The neck is going to be moving around, so unless you want your CGI head to look like it's floating in space, you need to track the neck movements at the top of the chest. This can be done by tracking a few points on the collar of the flightsuit (see the tracking sketch after this list). Once this is tracked, you apply the tracking data to a null object.

2. Parent the CGI head footage to the null object. This means that when the null object moves (because of the tracking done earlier), the CGI head will move too. Position the head in the correct place so that the base of the neck matches the top of the chest - if your camera set-up was done correctly, this will be very easy.

3. Apply a mask to your CGI head footage. This chops off the shoulders of your CGI character, so that the character appears to be wearing the flightsuit.

Now the problem of the moving necks really creeps in.

4a. Animate the mask to match the movements of the neck. When the neck moves, keyframe the mask movements to mask out the neck or flightsuit correctly for the given frame. This can take a long time if there's a lot of movement.

4b. Replace the neck portion of the background footage with a still of the neck which isn't moving around, and parent that to the null object too - depending on how well you do this, it can provide a static neck. This works for small collars like the flightsuit's, but big collars like the flying jackets' are more noticeable. Your mileage may vary.
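
For the curious, the point tracking that step 1 relies on looks roughly like this outside After Effects - a sketch using OpenCV's pyramidal Lucas-Kanade tracker, with placeholder collar coordinates; the averaged position plays the role of the null object:

```python
# Sketch: track a few collar points with pyramidal Lucas-Kanade and
# average them; the average stands in for After Effects' null object.
import cv2
import numpy as np

cap = cv2.VideoCapture("game_footage.mp4")       # placeholder filename
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Hand-picked points on the flightsuit collar (placeholder coordinates).
points = np.array([[[850.0, 620.0]], [[900.0, 615.0]], [[950.0, 622.0]]],
                  dtype=np.float32)

track = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    good = points[status.flatten() == 1]         # drop points that were lost
    if len(good):
        track.append(good.reshape(-1, 2).mean(axis=0))   # the "null" position
    prev_gray = gray

print(f"tracked {len(track)} frames; first position: {track[0]}")
```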

Conclusion

There were a lot of shots in Bored, Bored, Bored! which needed a great deal of planning to accomplish. Thankfully, with iClone I was able to get a lot of work done, and in some cases I was able to get better results, faster, than with previous tools. There are still technical hurdles to overcome - but then, if it was easy, where would the fun be?

And on that note, Have fun and Fly Aimless
