I haven’t posted anything on this site for a long time – I’ve been too busy with research. I’ll post up details of what I’m doing soon, which is essentially a real-time interactive version of what I did for my Cymatics short film. It’s called Cymulator (Cymatics Simulator) and it’s in its first prototype version at the moment…details to follow soon! 🙂
This is most likely the last post for the course, hard to believe it’s almost over – this has been the quickest year of my life!
Anyway, the film is done – the new scene was inserted, and a few of the fire scenes were re-rendered to try and highlight the reflections a little more (I created an alpha mask from the original footage so the cracks in the paving stones act as a mask for the light flashes and reflections). The edit has been rendered out as a 9GB (PNG) QuickTime movie that I’ll create copies from as needed. I’m pretty happy with the final result. It has been an effort getting all the glyphs modelled, all the particle layers rendered, and finally editing it together – which itself has been pretty straightforward.
So, the remainder of the week is getting the presentation organised as well as some prep for the degree show including a poster, and some ‘making of’ images and a reel.
A week and a half left to hand in and everything is going according to plan. Most of the compositing was done with 2 weeks to spare, which was in keeping with the original schedule, so I’ve been doing the odd re-render to clean up a few things. I noticed that some of the 3840 particle renders that I had composited didn’t look quite as good as the others, because I had reformatted them to HD before combining them with all the other nodes – rather than keeping the whole scene at 3840 until the end of the pipeline and reformatting just before writing out the file. I didn’t think it would matter, but there is a minor difference.
I also found that the purely CG scenes, the music notation in particular, seemed too ‘clean’ in comparison to the other shots, so I graded the colour to match the opening shot and added some grain to make them more like the camera shots. As for the molecules shot, I re-rendered it with z-depth and changed the flares slightly, so they didn’t look as ‘out of the box’ as the original preset did.
Tweaking the Edit
I haven’t needed to do too much for the edit, mainly because I’ve been replacing the scenes as they’ve come in. Recently, however, I have really been focussing on making the timing of the glyph shapes with the music as accurate as possible. Clearly, I accounted for some minor timing issues when I key framed the whole piece way back in the beginning along with the collision particles, and used them as a basis for the rest. The timing has been relatively good, however, so I’m quite happy with the small amount of lag here and there.
Titles, Credits and Additional Sound
I added a few foley sounds to the mix for effect – a wave sound to match the beach scene, as well as changing the equalisation for that shot to make the sound more ‘tinny’, like it’s coming out of the headphones. I tried to mimic the ‘detuned’ Doppler-type sound you get when a car drives past, but it messed up the soundtrack too much, so I settled for the sound of a car driving past to place it more firmly in the scene. In addition, I added some fire sounds to match the bonfire, and finally I created some additional new music for the credits – a simple series of notes on a synthesiser (from the original melody) that matches the glyphs at the end. I re-used some of the glyphs from the film, but placed on a black background with some added godrays they essentially feel like new shots – and combined with the music it has ended up being reminiscent of something like ‘Close Encounters’, which I quite like, as it gives it an other-worldly feel, and it’s quite funny!
The scariest moment of the whole project is playing what you have in front of your fellow classmates, especially when they know what they’re talking about. It’s a worthwhile process. Thankfully, there weren’t many suggestions for changing the whole thing – mostly useful ideas concerning grading and colour. One shot that seemed out of place was the beach shot – the girl in it is wearing headphones, and the shot was originally designed to show sound bleeding out of them, but without the VFX (which didn’t look good) that’s hard to see. I don’t know if the problem was exaggerated by the ‘radio’ type equalisation I added to the sound to bring the audience out of the previous perspective. Also, the fact that the words ‘and sound’ arrive a scene after the previous text makes it a bit disjointed. I tried bringing the glyphs back into the shot, but it doesn’t look good. I’m now toying with the idea of a last minute replacement shot, so I’ve hired out the GoPro with the prospect of trying a few things: possibly on the head of the guitar for a ‘down the neck’ shot; clamped to a pair of headphones, relating to the original idea somehow; or fixed to a little handheld radio being carried around. We’ll see what tomorrow brings after a crash course in using a GoPro.
I have also put together most of the slides for next week’s presentation. I have had to look back at the original programme of study reports to check on the consistency and evolution of my project. Essentially, have I produced what I set out to produce in the beginning? That, and many other questions over the nature of the methods etc. need to be addressed. This will be the last one, and it’s a bit longer than the previous presentations, but there’s plenty to talk about.
Well, it wouldn’t be a proper finish if I wasn’t trying to cram in something at the last minute! Last night I tried a few alternative versions with the GoPro – headphones on, radio shot etc. The headphones shot, which was my priority, made my head look too alien from such a close angle, so I’ve opted for a shot of the radio. There wasn’t too much effort in the initial composite, as I just plugged the new images into the old version of the beach shot:
Having said that, as the glyphs are now full frame, the original resolution wasn’t good enough, so I’ve had to re-render the whole scene – particles at double HD resolution and geometry at HD. I don’t have time to send the geometry through the render farm, so I’ve opted instead to use the software renderer at home and subtly dial in the amount of geometry in the comp. I increased the amount of godrays in this one a little and it seems to work well.
I’m now getting to the stage where I have only 3 scenes left to composite. Having said that, I will, time permitting, redo a few of them to clean up the odd issue. Most of them could probably do with a re-render depending on the issues each of them have. Clearly, though, time is a major issue and I need to draw the line under what I think is acceptable and what I think I can improve on in a very short space of time. Our final films will be shown at the DCA – so it will be great to see them on the big screen, but the downside is that any minor faults will be amplified somewhat! So, I need to do my best obviously!
This week also sees ‘Concerning Dragons’ on the BIG screen in City Square, so that should be fun.
Godrays and Lightwrap
Last week I forgot to throw in a picture of lightwrap in operation so here we are (this is the end of someone’s foot!):
The idea is that you take the light from the background (B) and apply it to the foreground (A). It can be blurred etc., and the effect depends on the intensity and how diffuse you make it. Good for covering up roto problems and for placing roto’d people into the scene more convincingly.
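The lightwrap idea can be sketched in plain Python with a toy 1D row of pixels – this is just an illustration of the principle (blurred background light added where the foreground alpha is partial), not Nuke’s actual implementation:

```python
def box_blur(values, radius=1):
    """Simple 1D box blur, clamped at the edges."""
    n = len(values)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def lightwrap(fg, fg_alpha, bg, intensity=0.5, radius=1):
    """Add blurred background light onto the foreground near its edges.
    The wrap is strongest where alpha is partial (the roto edge)."""
    blurred_bg = box_blur(bg, radius)
    result = []
    for f, a, b in zip(fg, fg_alpha, blurred_bg):
        edge = a * (1.0 - a) * 4.0  # peaks at semi-transparent edge pixels
        result.append(f + intensity * edge * b)
    return result

# A bright background behind a dark foreground with a soft roto edge:
fg       = [0.1, 0.1, 0.1, 0.0, 0.0]
fg_alpha = [1.0, 1.0, 0.5, 0.0, 0.0]
bg       = [0.0, 0.0, 0.9, 0.9, 0.9]
print(lightwrap(fg, fg_alpha, bg))
```

Only the semi-transparent edge pixel picks up the background light, which is exactly what hides a hard roto line.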
Godrays is a node that allows the colour of your scene/node to be splayed outward depending on where you place the centre point and transform point. It’s easy to get carried away with this one, so I’ve done my best to be subtle about it and enhance the glyphs by first rendering out a separate pass with just them, and applying it over the original ones to give a slight blur but also the impression of bursting out from the source:
Editing in Adobe Premiere
I had done a rough edit before semester 3 with the storyboards, then replaced that with a version using the footage without VFX. So, now I have been replacing the shots as they come in to see how it’s shaping up and what needs to be done. All seems to be going well apart from an unknown issue that keeps crashing the program when I try to render out the scenes as they are. At the moment I’m trying to suss out what the problem is. I’m going to try rendering out each scene as a separate movie clip, maybe with a PNG codec or something, and try again. To be continued…
…The PNG conversion seems to be working – I converted all the TIFF sequences into MOVs with a PNG codec, and Premiere is running more smoothly. I can only assume it can’t handle too many TIFF sequences at once.
All of the scenes have been rendered out, and I’ve put a rough edit together to see how it runs. It looks pretty good, although it’s hard to see it with fresh eyes as I’ve been focussed on the minutiae for so long. Some of the scenes have a few little problems: some glyph transforms need to be put right as they run off too quickly; a few additional roto masks are needed to brighten up the church roof or tone down the light coming in the windows; and the fake flash lighting in one shot needs its roto sorted out. Small issues really. I don’t think at this stage any major changes can or will be forthcoming.
Titles and Credits
I have begun creating initial ideas for the title sequence – so far, I’m using an extended version of the scene where I’m writing the musical notation, but this time with the text ‘Holographic Music: An Interpretation of Cymatics’ as the title. I have created 3 versions – one red, one green and one blue – which slip over each other to create a kind of chromatic aberration, until a little godrays and animation has the text fly out and fade subtly. I don’t want anything too flashy, as I’d like the film to start simply and build to a crescendo of sorts at the end. For the credits, I plan to use some cymatic images over a black background with an extended simple melody that will accompany the glyphs. (Images to follow)
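The slipped-RGB trick behind that chromatic aberration can be sketched in plain Python – a toy 1D example where the red and blue channels of one row are offset in opposite directions so edges fringe into colour (illustrative only; the real version is three offset text layers in the comp):

```python
def shift(channel, offset, fill=0.0):
    """Shift a 1D channel by `offset` pixels, padding with `fill`."""
    n = len(channel)
    out = [fill] * n
    for i in range(n):
        j = i + offset
        if 0 <= j < n:
            out[j] = channel[i]
    return out

def fake_chromatic_aberration(luma, offset=1):
    """Build (r, g, b) pixels from one luminance row, with red and
    blue pushed in opposite directions so edges fringe into colour."""
    r = shift(luma, -offset)
    g = luma[:]
    b = shift(luma, offset)
    return list(zip(r, g, b))

row = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]  # a white bar on black
for px in fake_chromatic_aberration(row):
    print(px)
```

The leading edge of the bar comes out warm and the trailing edge cool, which is the fringing effect the misaligned text layers produce.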
It’s now getting to the stage where I’ve almost completed the first round of passes in Nuke. What I mean by that is that I have a version of each scene, some in more need of repair than others in terms of additional rotoing, lighting and general tidying up. One thing that annoys me on a daily basis is the lack of storage on the computers in Uni that I’m using – not that I can really complain – but what seemed like a lot of space (100GB!) a while back is now a case of juggling things between drives daily to free up room. It’s largely down to the amount of images that I have rendered.
Some of the shots I expected to be simple have turned out to be problematic for various reasons:
- the last shot at Queen’s View is shaky due to its zoomed-in nature, and thus the glyphs over the top need to be matched for shakiness. So, I added a tracker and used a transform matchmove node to match the movement. I also had to key frame 2 separate transform nodes, as the roto and glyphs scaled and transformed as well:
- In addition, I had to reimport the original footage, as the colour-graded footage I created in After Effects is for some reason unusable – it jumps randomly from one frame to another – so I’ll colour correct it in Nuke
- a similar problem existed for the drive-past scene, where the graded footage seemed corrupted and looked interlaced – is there some kind of corporate battle between Nuke and AE to prevent me using both?!
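The tracker-plus-matchmove idea from the Queen’s View shot above boils down to adding the tracked per-frame offset to the overlay layer’s position. A minimal sketch in plain Python (the positions and deltas here are made-up illustration values, not from the actual track):

```python
def matchmove(glyph_positions, tracked_offsets):
    """Apply per-frame tracked camera shake to an overlay layer,
    like feeding a Tracker into a Transform (matchmove) node."""
    return [(gx + tx, gy + ty)
            for (gx, gy), (tx, ty) in zip(glyph_positions, tracked_offsets)]

glyphs = [(100, 50)] * 3              # a static overlay, 3 frames
shake  = [(0, 0), (2, -1), (-1, 3)]   # per-frame track deltas
print(matchmove(glyphs, shake))       # overlay now shakes with the plate
```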
To make the VFX sit in the scene in a more believable fashion, I had rendered out the light line passes, and they have helped. But more is needed! So, for the shots within the church in particular, I have begun rendering out TIFFs of purely the glyphs in the scene. The idea is that I’m going to flip them over and blur them to be used as reflections in the floor. This can be done pretty easily by using a transform node and setting the H axis scale to -1. Once blurred, the only thing left is to mask out the area further away from the reflections using roto techniques and feathering. I’ll add images shortly.
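The flip-and-fade reflection trick can be sketched in plain Python on a tiny list-of-lists ‘image’ – vertical flip (like the Transform node with scale H = -1) plus a row-by-row attenuation standing in for the feathered roto mask (the fade rate is an illustrative assumption):

```python
def fake_floor_reflection(image, fade=0.6):
    """Flip an image vertically and attenuate each row the further
    it is from the contact line, mimicking a feathered roto mask."""
    flipped = image[::-1]
    out = []
    for row_index, row in enumerate(flipped):
        # rows further from the 'floor line' (row 0) fall off faster
        weight = max(0.0, 1.0 - fade * row_index)
        out.append([px * weight for px in row])
    return out

glyph = [
    [0.2, 0.2],
    [0.8, 0.8],  # bottom row touches the floor
]
print(fake_floor_reflection(glyph))
```

The bottom of the glyph ends up nearest the floor at full strength, and the rest fades out – a blur pass would then soften it before merging over the plate.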
I’d forgotten to mention that last week I had started to incorporate depth passes into the pipeline as well. I’m only using them subtly, as much of the footage is mainly in focus and the demand for depth is minimal. I am aware of overdoing depth, but it can help to slightly blur out the foreground or background of the shot:
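The way a depth pass drives a subtle defocus can be sketched as a per-sample mapping from z to blur radius – zero at the focal distance and growing away from it. The numbers here are illustrative assumptions, not values from my scenes:

```python
def blur_radius_from_depth(z, focus_z, max_radius=4.0, depth_range=10.0):
    """Map a z-depth sample to a defocus radius: zero at the focal
    distance, growing linearly (and clamped) away from it."""
    distance = abs(z - focus_z)
    return min(max_radius, max_radius * distance / depth_range)

# In focus, slightly defocused, and fully defocused samples:
for z in (5.0, 8.0, 30.0):
    print(z, blur_radius_from_depth(z, focus_z=5.0))
```

Keeping `max_radius` small is the ‘using them subtly’ part – the clamp stops far background elements from smearing out completely.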
Here’s a shorthand, bullet-pointed way to create pretty good flares in Nuke:
- Track footage
- Create first flare – ‘burst’
- Merge using plus operation
- Hold CTRL and LMB drag from tracker x y position to x y position in flare node
- Presets –
- many bright – multi tab – asymmetry – repeat controls how many points on the star
- falloff – how sharp they are
- Ring color – colour of the flare itself
- Size mult – size of the flare
- Create a second flare – ‘streak’ – plug it into the first (drag the arrow from the first into it!)
- Alt + E to turn off expression links (green lines!)
- Use ‘bright’ preset
- Change colour inner ring to blue/white
- Colour shifts – chroma spread etc control chromatic aberration
- Size – radius to make inner/outer bigger etc
- Set anamorph to 7/10 (type in numbers!)
- Bring down size mult
- Bring down brightness
- Add 3rd flare – multi penta (for additional streaks) – this moves with the camera, so don’t need to copy the position
- Change ring colour and inner colour to blue
- Go darker on the ring colour to feather out the edges – then bring up outer falloff so it’s feathered
- Anamorph to around 10
- Do copy animation expression for position!
- Add a 4th flare – ‘magenta’ – multi penta again
- Decrease multiple flares to 4
- Use random offset to push them closer together
- Flare tab – up the size – and offset
- Size in the multi tab can vary the size in all of them
- Shape – corners 0 (flare tab)
- Change ring and inner colour to magenta/red
- Change chroma spread and shift to suit
- Size of flare doesn’t really change, but brightness does depending on distance
- Use multiply node to control brightness of all flares
Additional ambient lighting was going to be a problem – I tried doing it in Maya with the proxy collision objects as the surfaces to light, but tidying up the geometry to make it accurate enough would have been too time consuming. It was only designed, after all, for the particle collisions. So, I had to fake it in Nuke! One solution that works as a kind of ‘flash’ lighting effect was a combination of a radial ramp with a circular section roto’d out. I then keyed the flashes to coincide with the glyphs:
So, with the geometry and depth layers coming in as the render farm works through the files, I am focussing on compositing with Nuke. The main nodes I seem to be using so far include: read/write, merge, unpremultiply, premultiply, colour correct, grade, glow, blur, copy, and as of yesterday lightwrap. Lightwrap is a useful way of bringing in light around the edges of the foreground node using the background colour to bleed into it. This is good for covering up edges etc. Having said that, there are a couple of shots that will require redoing, including the beach shot where the roto is too rough.
- F – to frame the selected node or backdrop – this is much quicker than using the mouse wheel to zoom in and out
- D – this is an easy ‘disable’ or ‘enable’ shortcut for turning a node on or off
- Copy and paste work as usual!
I managed to render all of the light line shots at home over the weekend to save sending more through the farm – these were created using an inverse ambient occlusion pass and composited using the ‘plus’ operation on the merge node:
The static shot from the back of the church has caused a problem, in that the central point of the glyphs doesn’t match where the speaker is on the boom box. I don’t know why – anyway, I got around it by adding a transform node and slightly scaling up the background to match the glyphs. The only problem is that the light lines are then out of scale with the back wall. So, in workaround fashion, I have masked out the top overlapping part of the light lines, only for the section that is obvious, using a black rectangle:
Model Problems – When putting together the geometry renders, there was one glyph that didn’t align – the G1 polygonal particle goal was distorted – it seems that I had made some error way back in the construction of that layer. I missed it back then and only noticed that there was a problem now. So, basically I had to rebuild the polygons from the original nurbs to save time overall, and subsequently only have to re-render the poly goal particles for the 5 scenes that it appears in, rather than all of the nurbs and geometry again. One disappointing factor is that there was a beautiful colour ramp that seemed to work really well with the original, but flawed, version. I have to content myself with righting one wrong and losing one thing for the sake of the whole.
Today, Sean took us for one last class relating to Maya and Nuke, although he will be around to consult on the upcoming Fridays. Most of the Maya material he was covering was valid, but ultimately pointless for most of us as we’re already well established on our respective roads. The Nuke material was a bit more relevant, discussing ways to use ZDefocus, shuffle and colour correct amongst many of the nodes I’m familiar with. I must remember to add grain to my composites, as well as some, at times, overused flares!
Just to update, here is the current template for comping each glyph:
I have also decided to colour correct the collision particles to a light blue, away from the original orange, as it seems to sit in better with the general colour scheme of the glyphs as a whole:
Cymatics Text Shot
I wasn’t happy with the 3D track of the text shot – I had tried a couple of times to re-render and even tried to keyframe out some of the jerkiness of the shot, but I wasn’t happy. So, I have done another 3D track of the shot and rendered it. It’s much better the second time around, thankfully – job done on this shot.
The molecules shot is done now as well – I tried a couple of different versions of text which I wasn’t happy with, including text from Photoshop – and settled on the same font I used in the cymatics text shot (Adobe Garamond Pro) for consistency. In Nuke, I used two separate layers of text, slightly misaligned, with the foreground text a light shade of blue and the background one dark grey. This lets the viewer still read the text even though it sits in front of the central light, created from a radial node with glow, and a flare added for some ‘movie magic’!?
I’ve been trying to get the timing and camera movement of the molecules shot right. Showing the molecules vibrating, letting the viewer read the text, and making clear that it all creates the glyph shapes is difficult to do in a short space of time. I don’t actually like the way the text looks here, so I’m going to remove it and add it in the comp. I’ve also stretched the length of the shot out to 8 seconds – I’ll need to decide how I’m going to fit it in when it comes to the composite.
Depth Passes and Rendering
I’ve realised that I only need to make one depth pass by copying the master render layer and adding the appropriate material etc. For some reason I had been messing around with each glyph, so it makes life a bit easier hopefully. I haven’t as yet sent off the geometry or the depth passes for render, that will happen before the end of the week. Here is a sample of the geometry with the soap bubble shader attached:
As far as the rest, I still have 4 shots to re-render in double HD. Some of the original shots should be good enough for HD 1080 as they don’t take up as much of the frame and are obstructed etc by other objects. Anyway, at the end of this week we’re going to have a little Nuke session with Sean Yu, and that will set me up nicely to get on with putting the passes together for the shots.
I started putting some of the particle layers together to see how they were shaping up, minus the geometry thus far. I also wanted to check how to reformat the double HD sized images easily back to HD. It turns out that Nuke has a reformat node that lets you do this easily.
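The double-HD-to-HD reduction that the Reformat node handles can be sketched as a box-filter average of each 2×2 block. This plain-Python version is illustrative only (Nuke’s Reformat offers proper filter choices), run here on a tiny 4×4 ‘image’:

```python
def downsample_2x(image):
    """Average each 2x2 block - a box-filter reduction, e.g. from
    double HD (3840) back down to HD (1920)."""
    out = []
    for y in range(0, len(image) - 1, 2):
        row = []
        for x in range(0, len(image[y]) - 1, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

quad = [
    [0.0, 1.0, 0.5, 0.5],
    [1.0, 0.0, 0.5, 0.5],
    [0.2, 0.2, 0.9, 0.9],
    [0.2, 0.2, 0.9, 0.9],
]
print(downsample_2x(quad))  # a 2x2 result
```

Averaging four rendered samples per output pixel is also why the over-rendered particles hold up better than rendering at HD directly – each HD pixel is effectively supersampled.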
In terms of the comp test, I was looking to see if there was some kind of template I could put together for all of the shots. So far, it’s just a tree of merge nodes, with colour correct, glow etc., with the different passes feeding into them. Because each glyph needs more of something and less of something else, I don’t think there is a ‘one size fits all’ scenario, unfortunately. I need to experiment with grading on an individual, shot-by-shot basis. This may change as time passes!
Finally, I added the reformatted particles in with the TIFFs from the static church shot – the only issue is the misalignment of the central point of the glyphs and the speakers of the boom box. I can probably rescale the TIFFs slightly here to match, but hopefully the other shots won’t have the same problem. It’s looking good though…
In order to make the glyphs sit in the scene better, I thought it might be nice to have the shapes create some kind of light where they intersect with the surrounding scene. So, how to do this? I remembered seeing a tutorial a while ago that used inverted ambient occlusion passes to create a kind of laser scan effect, so I thought I could use the existing collision objects within the scene and the glyphs themselves to create a series of ‘light lines’ where the geometry intersects. Here is the basic set up to get it to work:
- create a lambert material
- apply it to the collision objects in the scene
- name the lambert LightLineMaterial
- in the attributes, set the colour to black; the transparency to white; and diffuse to 0
- in the incandescence map channel, add in mib_ambient_occlusion texture
- set the bright value to black
- set the dark value to a light colour – in my case light blue
- max distance (determines the size of the line) 0.1
- Attributes for the collision polys – render stats – turn off cast/receive shadows
- Renderer – Mental Ray – FG turned on, accuracy 300
- Add a transparent lambert to the glyphs – turn off cast/receive shadows
(Pics to follow)
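The inverse ambient occlusion trick above amounts to a simple distance falloff: full brightness where the glyph touches a collision surface, fading to nothing at the max distance (the 0.1 in the setup). A toy sketch of that response curve – illustrative only, not mib_amb_occlusion’s exact maths:

```python
def light_line_intensity(distance, max_distance=0.1):
    """Inverse-AO style falloff: 1.0 where the glyph touches the
    collision surface, fading linearly to 0.0 at max_distance."""
    if distance >= max_distance:
        return 0.0
    return 1.0 - distance / max_distance

# Touching, halfway out, and beyond the max distance:
for d in (0.0, 0.05, 0.2):
    print(d, light_line_intensity(d))
```

Because only near-contact points light up, the rendered pass naturally traces thin ‘light lines’ along the intersections, which is why a small max distance like 0.1 keeps them crisp.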
Musical Notation and Cymatics Text Shot
I have updated the musical notation shot, ready to render, with the notes popping out in time with the music now.
I had rendered the cymatics text shot and started the compositing process. However, the text was moving around in a rather random and unappealing fashion, and I realised that when I set up the shot, the text was too close to the tracked camera, so the slightest camera movement was greatly exaggerated. I have decided to redo it with the text much further from the camera. Second time lucky?
Particles and Rendering
All of the particles are now rendered (hopefully), so it’s full steam ahead with the compositing. I now have to figure out how to use Nuke in a semi-professional way. First up, unpremultiply and premultiply nodes! You need to add the unpremultiply node before colour correction and grading, then sandwich them with the premultiply node afterwards. This prevents the colour being multiplied with the alpha channel, which can cause issues! The only point to note – and I may be wrong – is that this doesn’t seem to apply to glows, which should be added after:
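The unpremult/grade/premult sandwich can be shown with one semi-transparent edge pixel in plain Python. The grade here is a simple additive lift, standing in for any correction that isn’t a pure multiply (values are made up for illustration):

```python
def unpremult(rgb, a):
    """Divide colour by alpha (guarding against zero alpha)."""
    return [c / a if a > 0 else 0.0 for c in rgb]

def premult(rgb, a):
    """Multiply colour back by alpha."""
    return [c * a for c in rgb]

def lift(rgb, amount):
    """A simple additive grade - a stand-in for non-multiplicative ops."""
    return [c + amount for c in rgb]

alpha  = 0.5                 # a semi-transparent edge pixel
stored = [0.4, 0.2, 0.1]     # premultiplied, as rendered

# Correct order: unpremult -> grade -> premult
right = premult(lift(unpremult(stored, alpha), 0.1), alpha)
# Grading the premultiplied values directly:
wrong = lift(stored, 0.1)
print(right)  # the lift is correctly scaled by alpha at the edge
print(wrong)  # the transparent part of the pixel gets the full lift
```

For a pure gain the two orders agree, which is why the mistake is easy to miss – it only shows up as fringing on soft edges once lifts, gammas or offsets are involved.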
So the first rounds of renders are done, but as I expected, I have begun to realise a few issues that will haunt me unless I rectify them. First of all, I only recently realised that I could render out at double HD (and reduce the size later) without too much additional work from the processor. Basically, once the cache has been completed, the render size doesn’t make a huge amount of difference when you’re using the hardware renderer. I had looked at the results of some of the renders and realised that they looked better in the viewport than in the final render. So, I tried double HD size and the difference is noticeable. Some, but not all, of the shots need redoing on that front.
Secondly, for some reason I had missed the fact that the collision objects need to collide with the actual glyphs themselves, rather than just the collision particles. I had previously thought that some additional rotoscoping would sort that out, but clearly that would look cheap and wouldn’t provide a proper 3D way to cut out the parts of the glyphs that intersect with walls and floors etc. The same goes for the boom box itself – it needs to be taken into account as a collision object. It’s something I could get away with a little more, but nonetheless I want it to look as good as possible, so re-rendering is the name of the game here.
There seem to be many ways to achieve this in Maya. I was looking for a relatively simple way to do it and this seems to work for me:
- Choose maya software renderer
- Highest quality anti aliasing
- Turn on use multi pixel filter – box filter?
- Don’t need ray tracing turned on
- Going to throw the sequence away once we’ve extracted the z depth
- Save as Maya IFF (carries z-depth)
- TIFFs and TGAs do not carry z-depth
- Turn on alpha and z depth beside camera
- Need to apply a lambert on everything so it renders fast
- RMB on render layer – overrides – new material override – lambert
- Need the material rather than just a surface shader
- Batch render dialog – can set the number of processors manually if you like
- Open up in fcheck – the diffuse rendering is blown out by the lights – the z buffer should show depth – then choose file – save animation – can save as a targa image
- Can take this into Nuke now
Using the Background Shader
NB – when using the background shader:
- Specular color black
- Reflectivity 0
- Reflection limit 0
Otherwise you’ll get reflections in the colour and alpha channels when you render out the geometry!
Rendering the particles has been a bit of a shepherding experience – each of the glyphs has had its own set of problems. For example, the G major glyph seems to have caused some major slowing down of my system to the extent that I have had to break up the glyph further into separate nurbs particle, poly particle and shell particle scenes just to be able to render them out. Most of these scenes have demanded that I cache out the particles beforehand, so the whole job of getting the particles rendered is an onerous process.
On the other hand, there is no way I would have been able to run these through the render farm at University – the caches for the particles would have been enormous – plus the fact that if something went wrong during the render, I wouldn’t have been able to stop it and fix the problem before resuming. It could have been a disaster! Hopefully, I will be finished rendering the particles out by the end of this week, then I can look to rendering the glyph geometry through the render farm.
Particle Settings – for my own future reference – I have been setting the playback speed to 0.2 (play each frame max playback 25fps) with the over sample rate set to 5. This was recommended by the fireworks tutorials I had gone through before for smoother particle results.
Mental Ray with Z-Depth
7 weeks to go and I continue with setting up the collision particles as the initial key framing tool, caching them and then rendering them out as they don’t take too long to do.
Once completed I import in the glyphs that I need in that particular scene and key frame an appropriate scale for them based on the collision keys.
Some of the issues have included:
- disk space running out, surprisingly! When I cache some of the collision particles, they take up over 100GB, so reorganising files and deleting them immediately after use is the order of the day. I have had to cut down some of the emitter rates as well – for example, the ‘D’ glyph cache in one shot stopped at 233GB as my machine ran out of space (and it was far from finished), so I had to significantly reduce the emitter rate to accommodate the cache. The downside is that the render won’t look as good, but I’ll need to compensate for that in the composite.
- missing the deletion of all history on the poly or nurbs shapes during modelling – consequently, some of the scaling has needed tweaking
- when I try to create a collision event, using the particle collision event editor, the program crashes. I believe that the collision particles are somehow still connected to a cache, so if I disable and delete the cache, sometimes it works. Alternatively, rebuild the emitter from scratch and it works – maybe start off with lower emission rates
- in terms of rendering the particles, I have tried out various versions to see which is the most efficient – as I have probably mentioned before, the best way seems to be to break the scene back up into its component glyphs and render them out separately once they’ve been scaled etc. So, just delete everything in the scene you don’t need and render out the particles you do – otherwise it slows the system down to an inoperable bottleneck of particles!
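The cache sizes above (100GB+) make sense from a back-of-envelope estimate: every particle stores a handful of float attributes on every cached frame. This sketch uses illustrative attribute counts, not Maya’s exact cache format:

```python
def cache_size_gb(particles, frames, floats_per_particle=6,
                  bytes_per_float=8):
    """Rough particle-cache size: each particle stores a few float
    attributes (position, velocity, ...) on every cached frame.
    Attribute counts here are illustrative, not Maya's actual format."""
    total_bytes = particles * frames * floats_per_particle * bytes_per_float
    return total_bytes / (1024 ** 3)

# e.g. 10 million particles cached over 500 frames:
print(round(cache_size_gb(10_000_000, 500), 1))
```

Halving the emitter rate halves the cache linearly, which is why cutting the rate was the quickest way to fit a runaway cache on disk.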
Colour Grading in After Effects
I had decided a while back to colour grade the shots of the car and the final shot over the loch, to make them seem at dusk, or the transition into night at least. I had found a decent tutorial on Video Copilot for an After Effects day-to-night conversion. Regardless of the fact that I have been using Nuke, I thought it would be more efficient to follow the tutorial and render the frames back out as TIFFs for use in Nuke, but with the colour grades added.
Generally speaking, I added a hue/saturation effect, curves and a mask (on a duplicate layer), which I know you can also do in Nuke but which I don’t yet have experience of. The tutorial goes on to add car lights etc., but I will attempt that in Nuke, as I’m more familiar with tracking now and will feel more comfortable doing it there. In the meantime, however, I have the same shots re-graded to appear more ‘dusky’!
Air Molecules Shot
To take the molecules shot somewhere, I decided I needed to show how the molecules bump together to form the Cymatic shape through a process of internal diffraction. To get the idea across, I thought I would use one of the pre-modelled glyphs as a goal for the ten different atoms and molecules that make up air. To do this, I thought I would try and utilise MASH, the procedural animation toolkit that I have looked at in the past but never quite needed to use. This plug-in basically allows me to multiply and distribute the separate atoms/molecules over the surface of the glyph in a random manner. Clearly, I could do this via conventional methods in Maya as well.
Anyway, I added a little random movement to the ‘X distance’ which is basically the amplitude of the distance along the normal, which gives the molecules a little shake on the surface of the glyph. So, the plan is to start off in close and pull the camera out to show the glyph. From there I’ll zoom out again to show the glyph represented by particles to take us back into the next scene. This will happen very quickly, hopefully it will make sense! To be continued…