
blimps are cool

Saturday, January 29

Jinn for Myth 1.5

Seems someone is working on porting Jinn to Myth 1.5.

Info can be found here.

Personally, I have no problems with what they're doing, but hey, I was only a lowly composer/sound designer/tester/voiceover/writer dude.

Friday, January 28

Spawn Project Folders

Some time late last year, I decided to code a small AppleScript which automatically generates a folder structure for our projects so we can keep all our outputs, assets and project files together.

I've now decided to put that script online. It's proven to be the best workflow solution I've developed, even if it's remarkably simple. I just run the script every time I start a new project.

You can download it here. [It's an OS X zip, so it still won't work on peecee] I take no responsibility for stuff. Just make sure you don't have a folder called 'new project' or the script will freak.

UPDATE: I should clarify that this folder structure is geared to an FCP-based workflow...
Hope someone finds it useful.
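
The real thing is an AppleScript, but for anyone who wants to roll their own, the idea is roughly this. A sketch in Python; the subfolder names are my guesses at an FCP-flavoured layout, not the actual structure the script creates:

#!/usr/bin/env python3
# Rough Python equivalent of the project-folder AppleScript described above.
# The subfolder names below are illustrative guesses, not the real structure.
import sys
from pathlib import Path

SUBFOLDERS = [
    "Project Files",     # FCP project files
    "Assets/Footage",
    "Assets/Audio",
    "Assets/Graphics",
    "Outputs",
]

def spawn_project(base: Path, name: str = "new project") -> Path:
    root = base / name
    if root.exists():
        # The original script 'freaks' if 'new project' already exists,
        # so bail out rather than touch anything.
        sys.exit(f"'{root}' already exists - rename it first.")
    for sub in SUBFOLDERS:
        (root / sub).mkdir(parents=True)
    return root

if __name__ == "__main__":
    spawn_project(Path.cwd())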

Thursday, January 27

Infinite Recursion (aka it's turtles all the way down)

Epithet for Today:


Geek is just a name for someone who knows what they're talking about.


(I made that up! Cool, huh?)

Speaking of Geeks:

David Starkoff has linked to me linking to him, so I'm now linking to him linking to me for linking to him. I realise it could get silly, but seriously, this blog has been around for nearly 18 months, and in the last week it's actually been getting some attention. What's sad, though, is that that actually makes me happy.

Regardless, David's blog is very good. For example, he wrote some amusing commentary on Lessig's visit to Australia. I didn't even realise Lessig was coming to Australia!

David almost makes you believe he [as in David, not Lessig, but Lessig suffers the same problem] is human, but then occasionally and unfortunately reminds you that he is in actual fact a lawyer. David [and Lessig] usually do this by talking about cases with a kind of glib glee. Humans do not react to cases with glibness or glee, let alone both.

Gabriel's Day - If - Music Video shot on Varicam

I've finally gotten around to making a new encode of the video clip I directed for Gabriel's Day. I had done one previously, which I posted on MVWire and 2-Pop. That previous version had a bit of a gamma shift problem, namely it went too bright and bleached all the detail away.

You can download the clip here - it's 63 meg or so... and our webserver ain't THAT fast, a 1Mb pipe I believe.

For those interested, it was shot on a Varicam, finishing back to 25P. We shot at 30FPS, 40FPS and 60FPS. According to Rexel, who are the local distributor of the Varicam, we had to shoot at a system cycle of 29.97 for variframes to work with FCP. I live in a PAL country, so shooting 29.97 is a bit weird. We captured using the 1200A via FireWire, using all the workflow processes which have become standard. It took me a while to get it working, as I had to set the deck to 29.97 for it to play back properly.


After loading the footage and running the frame-rate converter, I found that the Varicam footage was rather noisy - noisier than the SDX900 (a camera which I love) - and the light fall-offs rather bandy. In hindsight I think it was because we pushed the 'dynamic range compression' too much (set it to 500%), and gave too much of our bit-budget to the highlights when we should have given it to the shadows. That said, the Varicam may be a 100-megabit camera, but it shoots 60 progressive images a second. In effect, each frame gets the same bit-budget as DV50, but spread over a 720P image rather than a 480P image... of course it's going to be more lossy. Anyway.
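
As a rough sanity check on that per-frame claim, here's the arithmetic as I understand it. My own back-of-envelope figures, assuming the nominal stream rates rather than anything from Panasonic's documentation:

# Back-of-envelope per-frame bit budgets (assumed nominal rates:
# DVCPRO HD ~100 Mbit/s, DVCPRO50 ~50 Mbit/s).
dvcprohd_rate = 100e6          # bits per second
dv50_rate = 50e6

per_frame_60p = dvcprohd_rate / 60     # Varicam recording 60 frames a second
per_frame_dv50 = dv50_rate / 30        # DV50 at ~30 frames a second

print(f"DVCPRO HD @ 60p: {per_frame_60p / 1e6:.2f} Mbit per frame")
print(f"DV50 @ 30fps   : {per_frame_dv50 / 1e6:.2f} Mbit per frame")
# Both land around 1.67 Mbit per frame - but the 720P frame has roughly
# 2.7x the pixels of a 480P frame (1280x720 vs 720x480), hence more compression.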

I also had plenty of gamma problems. While I did the main edit in DVCPRO HD, I finished the video in Blackmagic's 10-bit UC codec... and that introduced a bit of a gamma shift. I eventually discovered that I needed to shift the gamma by 1.2 for it to look correct on my production monitor (a properly calibrated 20" Panasonic 800-line beast). All colour correction was done on the DVCPRO HD timeline, but I added 'tweaks', such as the vignette and letterboxing, when I downconverted it to an SD frame. I found I could do frame extractions of around 40%, i.e. scale by 140%, without noticeable artifacting. That allowed me to create nice buried zooms, of which I have grown rather fond.

Most significantly, I also had to do the entire edit in a 23.97 timeline and slow down the song to make it work. For those who don't know, I'm in Australia and we broadcast at base 25. I came up with a solution before we even started shooting....

Once the edit was locked off, I exported a TIFF image sequence out of FCP. I then loaded the image sequence into AE as a *25FPS* sequence, not 24P. That way I got a frame-for-frame matchback, which is what I wanted. Speeding up the footage in FCP from 23.97 to 25P introduced a whole lot of motion artifacting, particularly as it had to do 'field blends' to create frames. I lost the whole 'film look' of the progressive scan, which was just stupid. A proper frame-for-frame match is far preferable. I then just rendered the whole thing into a 10-bit Blackmagic UC movie, which I dumped back into FCP with the appropriate idents. Unfortunately, I had to spend some time working out how much to speed up my audio in order to get sync. You see, I screwed up when I slowed the audio down earlier in the process. I did my maths on the basis that it was being slowed down from 25fps to 24fps, not 25fps to 23.97fps. ARGH! Anyway, it ended up being around 1% faster or something ridiculous.
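
For what it's worth, here's my rough take on the ratios involved in that conform. These are my own back-of-envelope figures assuming a 23.976 timeline; the actual correction needed on the day may have differed:

# Speed ratios for cutting at ~23.976 and conforming frame-for-frame to 25fps.
# Back-of-envelope figures only, not numbers from the actual session.
edit_fps = 24000 / 1001        # ~23.976 fps timeline
pal_fps = 25.0

conform_speedup = pal_fps / edit_fps     # frame-for-frame 23.976 -> 25 (~4.3% faster)
correct_slowdown = edit_fps / pal_fps    # how much the song should have been slowed
mistaken_slowdown = 24.0 / 25.0          # the 25 -> 24 slowdown actually applied

# Net speed of the song in the final 25fps conform, given the mistake:
residual = mistaken_slowdown * conform_speedup

print(f"conform speed-up  : {conform_speedup:.4f}")    # ~1.0427
print(f"correct slowdown  : {correct_slowdown:.4f}")   # ~0.9590
print(f"mistaken slowdown : {mistaken_slowdown:.4f}")  # 0.9600
print(f"song in final cut : {residual:.4f} x original speed")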

When it came to outputting to tape, I found I'd made another mistake. I accidentally rendered the footage into the Blackmagic DV10 codec, which was their variation for Digital Voodoo! I couldn't actually run out via SDI. Fucken. I had to re-export the whole thing from AE again. Ah well. At least I got it right eventually.

... and here we are.

The clip itself I'm quite happy with. It was written, produced and posted in around 5 weeks. I think it suffers a little from being set all in the one location and my emphasis on narrative rather than eye-candy. I also could have done with moving away from my usual Clara Law-inspired coverage, and used more handheld! In other words, I could have created more of an arc for the visuals and not just the performances... which are actually quite good, given we didn't end up casting the male lead proper until the 1st day of shooting! Gotta love that.

Props to the crew. The next clip is going to be on 2-perf 35mm, I think... In some ways, it'll be much EASIER than dealing with the Varicam. However, it won't be anywhere near as cheap, partly because Lemac did us a great deal on the Varicam, sticks, deck and HD monitor.

UPDATE: In the past, some people have asked me about the 'projector stuff' and how we achieved that look. Pretty simply, actually. We shot the band in front of a 12x12 silk, which was backlit by HMIs (which I was happy to let flicker). Alexis then lit the band with a kino or two and a 1K spot (with 216 diffusion). On the split, the image was pretty... ah... crap. This was a post job, but Alexis knew that and gave me what I needed - good lighting ratios. He even pulled out his light meter to check 'em!

In post, I pulled a luma-key on the highlights - removing the white silk. I desaturated the resulting image, pushed its contrast, then gave it a slight tint. That was then composited onto the backplate of the projector, which we had shot during the shoot. We shot a range of plates, actually; plates are VERY useful.
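
For the curious, the chain of operations is roughly the following. This is a sketch in Python/NumPy of the steps described, not the actual FCP filters I used, and the threshold, contrast and tint numbers are purely illustrative:

import numpy as np

def projector_look(frame, plate, key_threshold=0.85, softness=0.1,
                   tint=(1.0, 0.95, 0.8), contrast=1.6):
    # frame, plate: float RGB images in [0, 1], same shape.
    # All parameter values here are made up for illustration.

    # 1. Luma-key the highlights: the bright white silk goes transparent.
    luma = frame @ np.array([0.299, 0.587, 0.114])
    matte = np.clip((key_threshold + softness - luma) / softness, 0.0, 1.0)

    # 2. Desaturate the band footage.
    mono = luma[..., None].repeat(3, axis=-1)

    # 3. Push the contrast around mid-grey, then add a slight tint.
    graded = np.clip((mono - 0.5) * contrast + 0.5, 0.0, 1.0) * np.array(tint)

    # 4. Composite over the projector backplate using the keyed matte.
    alpha = matte[..., None]
    return graded * alpha + plate * (1.0 - alpha)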

I stumbled across the idea of having the band running concurrently in the background when I was doing one of the comps. When editing, I tend to lay down my master tracks - in this case a wide of the band - as the very bottom track, then 'build up' my cuts. This allows me to experiment with different coverage combinations easily, as I can just 'enable' and 'disable' certain clips to try different shot combinations. I didn't turn off the master shot while I was compositing one of the CUs. When I looked at the comp, I thought 'wow, that looks wicked'. So that was a happy accident. The actual idea for the look, however, came to me when I was in a lecture on evidence law and we were examining overheads of grainy photos - I was like 'now *that's* an aesthetic'. It was pretty easy to imagine how I was going to pull it off too. As I've alluded to earlier, it's an important part of my creative process to understand my tools - to comprehend the entire thing from shooting to post to final product and everything in between.

Any comments most welcome.

Partial Credits List

[Apologies to those I've missed and whose names I've misspelt]

Gabriel's Day are
Scott Mesiti (Bass/Vocals)
Daniel Simmons (Guitar/Vocals)
Linzi Steele (Drums)

Cast:
Jo Briant - Marla
Pete Burges - Steve
... and a bunch of extras [really sorry guys! you were champs, even if I can't remember your names]

Crew:
Producer/Director: Stu Willis
1st AD: Antigone Garner

Director of Photography - Alexis Tarren
Camera Operator: James Fenton
1st AC (briefly): Kim Sargenius
Gaffer: Peter Marsden
Best Boy: Jamie Nimmo
Sparkie: [Some guy who I forget but was really sarcastic the whole shoot]

Art Directors: Julian Hill & Kush Badhwar
Costume Design: Rachael Cassar
Makeup/Hair: Andy Dawe & Shevaun Robertson

Post: Stu Willis

Me, about 4 years ago, when I used to be a radio presenter at my Uni's radio station (2SER FM):



Nat, my co-host at the time, does not look impressed!

Hmm. Vanity!

Currently playing in iTunes: Cloudburst - Jon Hendricks

Oh wait, it just changed:

Currently playing in iTunes: Mumbles - Clark Terry

Oh crap, and again:

Currently playing in iTunes: Knucklehead by Grover Washington Jr

[MarsEdit is too much fun!]

Color correction in linear vs. gamma corrected space

Stu Maschwitz of The Orphanage has finally updated his blog with a very interesting (but short) article on


... how simple color corrections behave differently in linear floating point space.


He goes on to conclude that

It's not that you can't ever use gamma correction in linear floating point — you just have to be careful. I've found that in linear space, gamma should be the last thing you adjust, not the first, and this is very different than most peoples' current experience in clipped vid space.


It's totally worth reading if you're interested in colour correction, but it's pretty heavy stuff. I haven't read Brinkmann completely yet, so I'm a bit lost. Need to read it again.
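
To illustrate the general point with a toy example of my own (not something from Stu's article): the same simple correction lands differently depending on whether it's applied to gamma-encoded display values or to scene-linear values, and linear float keeps super-white values around instead of clipping them.

import numpy as np

# Toy example: a 1.5x gain applied in display space vs scene-linear space.
# The 2.2 gamma and the pixel values are assumptions for illustration.
display = np.array([0.10, 0.50, 0.90])     # gamma-encoded pixels
linear = display ** 2.2                    # rough conversion to scene-linear

gain_in_display = np.clip(display * 1.5, 0, 1)
gain_in_linear = np.clip((linear * 1.5) ** (1 / 2.2), 0, 1)   # back to display to compare

print("gain in display space:", gain_in_display.round(3))   # [0.15  0.75  1.   ]
print("gain in linear space :", gain_in_linear.round(3))    # [0.12  0.601 1.   ]

# And unlike clipped 8-bit 'vid space', linear float happily holds values
# above 1.0, so a later exposure or gamma move can still bring them back.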

UPDATE: Wow. I got linked on the Phila Final Cut Pro Users Group blog! Thanks Mike! Specifically, greeny comments:


What they're both lamenting is a real Color Correction workflow. Oh, FCP has a quick Color Corrector; it's adequate for 95% of the people, but it's not a full color correction interface.


I should post more analysis of workflow issues. I still think WorkFlow Films is a kinda neat name for a production co, considering how obsessed with it I am becoming... hmm.

Hi Chloe


You can tell how much money the film is going to make by how it does on the first weekend. The whole culture is in the craphouse. It's not just true in the movies, it's also true in the theatre.

Broadway, and now London is the same, special effects are in great demand. It's not a good time culturally.


-- Dustin Hoffman, quoted in the Guardian

(There isn't actually that much else he says in the article, but you might want to follow it 'cause the Guardian is a good paper, even if it is a den of left-wing lunatics)

Wednesday, January 26

Core Your Image in Half Floating Whatsit

Wow. What an update.

My good friend, the ubergenius Alex Fry, happens to be a very, very good Shake op. He read the conversation between Mike and me and offered the following comments:

"ok.. i can maybe clear up a few things
core image definitely support at least 16bit half float in hardware
there was mention of 32 bit float relating to the windowing system, but how thats translates to CI and wether thats hardware or software i dont know
as for previewing 16bit images on 8bit monitors..
its usually not a problem... most banding issues are a side effect of pushing that limited 8bit data around.. and the rounding up and down of values that this causes.
as for shake being a usuable DI tool
its would requre a massive overhaul of the way shake's timeline works.
as it currently stands its complete crap for working with more than 1 shot
especially if there is any chance the edit might change..
none of the roto tools for example, can have their keyframes slipped in time"


When I told him I wanted to post the comment, he wanted to hunt up his references. The 32-bit floating-point stuff came from the 'State of the Union' address by Apple, but he's not sure where he heard about 16-bit half-float. To me, though, that's just a logical extension: if you're going to support 32-bit floating point, why not do 16-bit half-float? The latter is the one that's becoming the industry standard.
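
On Alex's banding point, a quick way to see what he means (my own toy illustration, nothing to do with Core Image specifically): push an 8-bit gradient around and the rounding collapses it onto a handful of steps, where 16-bit data still has levels to spare.

import numpy as np

# Toy illustration: darken a smooth gradient by 4x, bring it back up, and
# count how many distinct levels survive at different bit depths.
gradient = np.linspace(0, 1, 4096)

def roundtrip(values, bits):
    scale = 2 ** bits - 1
    quant = lambda v: np.round(v * scale) / scale
    return quant(np.clip(quant(quant(values) * 0.25) * 4.0, 0, 1))

print("distinct levels after 8-bit pipeline :", np.unique(roundtrip(gradient, 8)).size)
print("distinct levels after 16-bit pipeline:", np.unique(roundtrip(gradient, 16)).size)
# The 8-bit version collapses to roughly 64 steps - that's the banding -
# while the 16-bit version keeps thousands of distinct values.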

P.S. I got 'my' iPod back from repair. Basically they replaced it with what looks to be a brand spanking new 3G Pod. Awesome! It even has the blue backlight, which my first run 3G never did, and they even had it up to date software wise. Apple rules u.

Student Ownership of IP (Updated)

You may remember the 'blogwar' which erupted late last year regarding (film) school ownership of IP. Well, very shortly after, the same debate emerged on CML and I decided to chip in. I had learnt a few new facts regarding VCA's claim to ownership (essentially they argue they are producers) that I hadn't posted here, and there is a lot of discussion about the American situation and how it differs because of 'work-for-hire' (NB: which does not exist in Australia).

Anyway, a while ago it was posted up on their main archival discussion area and you can read the entire thread here.

Doesn't Bittorrent + Airport = Wireless Distribution?

Wired writes about the first wirelessly distributed film:


PARK CITY, Utah -- It was a film without film, a movie without moving parts. The premiere of Rize that took place last Saturday at a ski lodge here was a historic event -- the first feature film to be delivered via wireless internet technology.


Good article. It addresses the tech and, more importantly, the implications of the tech. I agree with them that wireless distribution has a significant future in film distribution.

Specifically, it will allow global roll-outs - which will be an important step in combating piracy. Not that I think the film industry will necessarily go that way. But certainly, there are films I have pirated with BT because there was *no* release date here. Or, in the case of Spartan, when it finally came out it lasted all of two weeks. Worse, they released Spartan on DVD in a pan-and-scanned 16x9 version rather than the original 2.35. Just because it's 16x9 doesn't mean some studio knob hasn't butchered it. Butchering is what studio knobs do :)

It will also, hopefully, lower distribution costs for indies. No longer will one have to strike a 35mm print, insure it, and ship it around to a few of the small cinemas and festivals.

Honestly, if I were the AFC/FFC - and we all wish I was - I'd be rolling out digital projection, as per England, and planning to equip a few key 'digital cinemas' (places like the Chauvel, Dendy etc) with wireless receivers when the tech gets safe. That way, local Aussie films can be quickly and easily distributed to a number of key sites without paying the costs of shipping prints! Moreover, films which do not fit the normal mould of the feature can be screened more easily. I'm thinking specifically of short features, short longs, long shorts, and shorts. You could have an entire funding scheme, which they already do partly, pushing filmmakers to experiment with digital production. I wouldn't be cutting the funding either (it's $90K of funding for a short shot on digital, but $150K for a short shot on film). I'd be showing how much further you can push the dollar with this tech. Then I'd be distributing the product digitally. No-one goes to see these films anyway, so why pay for the 35mm prints? :) It also means you can strike an HDTV master more easily... and you could grade the whole thing on Final Touch HD! Ahehehehe.

But it does mean we need to get a few more alternative HD formats here. There's the Varicam and HDCAM. I believe Panavision have been showing their Genesis camera system around town though, and it's a 4:4:4 camera... with dual-link 4:4:4 out and single-link 4:2:2 HD-SDI... which means you just need a D5 deck or an SR deck. Hopefully someone might import Arri's HD offering... but there's no market currently. But that's what I want to change.

Anyway. Go read the article.

Tuesday, January 25

Why Desktop DI Matters [Warning: RANTY]

Woohoo! I'm famous. Mike Curtis of hdforindies.com has blogged the posting of the conversation between him and me regarding the future of Shake. I expect a few new visitors to the main part of this site, so I figured I should give 'em something worthwhile to read.

Briefly, I think they should check out the link I posted concerning Apple's buyout of Shake back in 2002. It's a good reminder of how quickly these things change and how often commentators (myself included) get it wrong.

Speaking of comments, I posted some stuff on Final Touch HD. While I was perhaps overly enthusiastic, it comes from my perception of the problems facing the Australian Film Industry (aka government-funded-hobby).

Which brings me to what I want to talk about now: Why Desktop DI Matters. This is going to be more stream of consciousness than any kind of proper essay. I apologise for that, but I've been writing essays for the last decade, and I kinda can't be bothered any more. It may also end up pretty short. We'll see.

It's easy to dismiss the notion of 'desktop DI' as technolust or, worse, the dream of control-freak directors. This may be partly true. I'm certainly guilty of wanting faster, cheaper, better. But without 'faster, cheaper, better' we'd still be cutting on Steenbecks. That said, we shouldn't see the shift from the actual cutting of film to NLEs as one without creative consequence. Tools are a fundamental part of any creative medium - if the word 'medium' didn't tip you off to that fact already. You paint with brushes, draw with pencils, and photograph with cameras. Replace a stills film camera with a motion film camera and you have a new medium. The gradual shift from black and white to Technicolor; from film to HD; from Steenbecks to NLEs... these are changing the form. It's no different from the shift from, say, harpsichords to pianofortes. It changes the form, however gradually, however subtly. The pianoforte empowered composers with a greater range of expression for their music, but it wasn't a carbon-copy replacement for the harpsichord. Likewise, the NLE has empowered editors and directors. They are more free to experiment with shot combinations. When finishing to tape, they don't have to heed the limitations of optical effects or neg cutting. But some feel there has been some loss too. The over-reliance on production monitors for preview has changed the pacing of movies. There is greater reliance on 'TV'-style coverage - closeups and more frequent cutting. Scenes don't breathe as well on the big screen as they do on TV. Effects work also suffers when previewing on the small screen.

What about sound design? Cutting mag tape is a friggin' pain in the ass. What made the work of Walter Murch and Ben Burtt so astonishing in the 70s was that they pushed the boundaries of sound technically and creatively. Listen to THX-1138 and compare it to other films of the early to mid 70s... it's simply astonishing. But what Ben and Walter accomplished in the 1970s is now available to every sound designer with ProTools. The ONLY limitation to the power of sound in film is the artistry of the director and sound designers. Of course, sound designers want more out of ProTools (more tracks, more real time, etc)... but the paradigm shift pioneered by Lucasfilm has stayed. Generally, modern films have better sound design (imnsho) than they did 30 years ago. By better, I mean capable of reflecting and enhancing the storytelling.

DI is causing a similar revolution at the higher end of film production. What is curious about DI is that it offers film DOPs techniques which have been available to those finishing to tape for some years... It IS changing the face of cinematography on an artistic level, because it empowers DOPs in very new and exciting ways. A film like Hero, which uses colour as a fundamental component of its story on a metaphysical and narrative level, would simply not have been possible without DI.

For me, however, what I find exciting about DI is that I *understand* it. Sure, there are crazy issues to consider, like asset management and workflow... but they're logistical and can be solved. Creatively, DI is effectively what I have been doing since I first started playing around with Photoshop. It's a core part of my workflow. When I conceive of a 'look' for a project, I conceive it holistically, from shooting to grading. When shooting a scene, I can consider what can (and can't) be corrected digitally. I don't think of it as cheating. It's a tool which allows me to communicate what I want to communicate better; it's not a dick-measuring contest. If it were, then light meters of all kinds would be cheating!

Yet, while I may theoretically understand the film post process (film acquisition, film cutting, film finishing), it's only theoretical. That kinda scares me. Sure, if someone asked me to direct a feature tomorrow that was to be a 100% film post process (it's quite cheap!) I wouldn't hesitate to say yes. But that's not going to happen. I have accepted that when I get around to making a feature, it'll likely be a low-budget affair on my own terms. That means I *need* (on a psychological and creative level) to understand the process if I'm going to exploit it to its capacity. I don't feel I can do that with a 100% film process. Yet I'm not sure if I can afford a proper DI with colour correction on a Da Vinci in a suite for two frigging weeks. That's why I like desktop DI. It allows me to use the same tools I understand, but on a feature scale. Don't get me wrong, I don't want to be a writer/director/producer/editor/sound designer/colourist, nor do I intend to be. There are plenty of talented people to do most of those things... plenty more talented than me. But some of them won't be in facilities with 'big iron', nor would I be able to afford them if they were. However, if I could put a Final Touch HD or Shake setup in the budget of the feature... then we'd be sweet. I could get someone in to operate it, at least bring it to 99% finished, then take it to Big Iron Post for the final clean-up.

From an artistic perspective, desktop DI allows the new school of filmmakers (the so-called 'digikids') to do stuff their way, the way they've always conceived of using their tools... It allows us to practice, to become better, and one day (some of us hope) to migrate to the big systems, with the big skills behind us. How can that result in anything BUT better movies? [Issues of script quality aside.] Some fear that this is going to cause problems for the post industries, who can no longer convince clients to pay them what their staff are worth. It will certainly cause a degree of market diversification, but portfolio theory dictates that this is a good thing. I agree. If I only need a job done at a '5' quality, why pay a '10' price?

What's great about the desktop DI thing is that it's a step in between shitty DV-blown-to-film aesthetics and full-blown 35mm production. It allows filmmakers to have a choice. Shoot S16, finish to 35mm, with a DI for colour correction - and do it all for under X!

More specifically, this brings us to Australia. For some time I've been ranting about how we need to fully embrace the 'digital revolution'. Specifically, I think it can be used to LOWER the cost of production here. Either our films need to make more money, or cost less. So why not try and do both? If we acquire on film or HD, but adopt a world-class digital post workflow... with possible digital projection... we can make our films much cheaper. Cheaper films mean less risk money-wise, which means riskier scripts can be made. If there's an environment where risky scripts are getting made, more writers will be more creative, because they won't be as concerned about chasing the mighty dollar.

... and better scripts WILL make better movies (ish). There are still directors to screw 'em up, but it's harder with a good script.

Desktop DI also means our cheaper movies won't look bad - hell, they may even look good! Slick, even!... and what Australia DOES need is a few slick films, well made, well written, well acted, and well directed... Then maybe, maybe, we might start feeling a little national pride vis-à-vis our movies.

Someone try and prove me right... please?