SALT LAKE CITY — The Sundance Film Festival is a great time to sit down with people all across the filmmaking community to discuss what’s trending behind the scenes.
Over the next two weeks, we’ll be chatting with people who represent different fragments of independent film. We'll ask them what it takes to get that first project into a festival, then post the conversations in this series we’re calling “Sundance Chats.”
Recently, we had the privilege of chatting with Al Mooney, product manager for Adobe's video editing software Premiere Pro. Mooney works with filmmakers at every level of the industry, including the editing teams behind 2014's "Gone Girl" and the Netflix original series "House of Cards."
Our topic? The technology that drives the process.
Travis Poppleton: Let’s talk kind of big picture at first, and then I'd like to narrow down to the Adobe product specifically. A couple of years ago, I heard one of the editors from "House of Cards" at Sundance, and he was talking about how they were shooting in 5K (5120x2880 resolution) and that just kind of blew my mind. What is the mainstream format a filmmaker would expect to shoot today?
Mooney: Yeah, so I mean, just real quick, the person at Sundance you were talking to, that's Tyler Nelson, and he works with David Fincher, of course, and "House of Cards" was a Fincher project, as was "Gone Girl," which I have to get into these opening sentences because we're very proud that it was the first kind of AAA feature cut on Premiere Pro last year, and this is hugely exciting for us.
It's fair to say that those guys pushed the boundaries as much as the boundaries can be pushed. They shot 5K for "House of Cards." They actually shot 6K (6144x3072 resolution) for "Gone Girl," and part of the reason they do that is because Fincher is extremely particular about how shots look, and he likes to be able to, what you might call, overshoot, so he can pull whole frames out of a 6K image. He was pulling 4K (4096x2160 resolution) and 5K frames out of 6K for "Gone Girl."
I think it's fair to say that the majority of independent filmmakers, certainly, are not shooting with 4K cameras. That said, it's certainly something that's increasing, and I think it's interesting to take a step back and look at where we've come in terms of pure acquisition. It's been less than five years, and the world has just changed dramatically, really.
A few years ago, we saw this kind of revolution, the almost accidental revolution, where DSLR (digital single-lens reflex) still-image cameras, almost without the manufacturers thinking much about the added support for shooting video, opened up a whole world of available quality for an independent filmmaker, which he or she just couldn't possibly have achieved before on that budget.
You can go buy a 5D Mark III or whatever it is for a few thousand dollars and a couple of pieces of glass, and shoot to a quality that basically looks like you've shot on film. If you look at what independent films looked like seven or eight years ago, they all looked like video, and that's because they were video, shot on video cameras, because that's what was affordable and on the market. Now they all look like they've been shot on super high-quality film or produced on film cameras.
Poppleton: This DSLR phase, is that something we would see a lot of here at Sundance? I would expect to hear about people shooting on whatever the latest RED camera is, but are there people actually shooting DSLR video that’s good enough for major festivals?
Mooney: If you had asked me that question three years ago, I would've said largely yes. I think what's been very interesting is the more traditional camera manufacturers, and I wouldn't put RED as a traditional camera manufacturer, but look at ARRI and Panasonic, and those guys who maybe were slightly taken aback by the DSLR revolution.
What happened is they kind of saw what was happening and then very much altered the type of cameras that they make, and so I think this year if you ask around, you'll hear a lot of people have shot on RED. I think a lot of people will talk about having shot on (ARRI’s) ALEXA. That's also a very beautiful film-like camera that is, in a sense, affordable.
I think the sort of accidental shift in acquisition that DSLRs brought about was very much a tipping point in the way that people thought about what type of quality they could acquire for the clearly very limited budgets that I would suggest most filmmakers have.
Poppleton: Personally, I shoot a lot of home video, and I think a lot of people do, especially with YouTube and Vimeo giving people a way to distribute their own projects. And I think there's been just a general democratization of film that's been fantastic. If we shift this over to Adobe for a second, I'll often use Premiere Pro, and it's funny to me, with how simple the Adobe layout is, that you're saying films like "Gone Girl" are being edited in that exact same program. Is that entirely accurate? Is there a different version of Premiere, or is everyone from the Hollywood director to the guy making YouTube videos using the very same software?
Mooney: Well, no, there's no special secret version. Although, actually, when we work on something as important to us as "Gone Girl" or with many of the other big companies we deal with, you can imagine they'll probably be using a version that we haven't quite released to the public yet, and that's not going to be some crazy, super high-end different application.
It'll just have a couple of features that are specific to their workflow, and we'll have designed stuff with them, so we're working with them literally on a daily basis with engineers in-house. You sort of need to hold hands like that on a project that big. But yes, they were using products like CC, very much the same version that you have installed on your desktop.
You can imagine, specifically for "Gone Girl," you're working with thousands of hours of 6K footage and what you might call invisible VFX (visual effects), in the sense that you can't actually tell it's a VFX shot. It's not aliens and explosions, but it's taking elements of one frame and compositing them into another.
Something that was extremely important to what they did, and something that is one of our strengths, is the dynamically linked workflow between Premiere and After Effects, whereby we could have, effectively, live VFX shots from an After Effects VFX artist right in the editing timeline.
Without going too in depth though, the short answer is, yep, bar a couple of little extra things we put in as we went, they were pretty much just using the same application that you get for $50 a month.
Poppleton: So Adobe Premiere is able to do 6K currently, like out of the box it's able to render 6K? I imagine they were working on a pretty amazing machine, though, yeah?
Mooney: Yeah, you've just hit the nail on the head. There's one thing that I always like to make very clear when I talk in user groups. We call ourselves resolution-independent. The highest resolution you could actually edit today would be about 9.5K. Thankfully, very few customers are asking us for 9.5K workflows right now, but I'm sure sometime this year that's going to happen.
But yeah, I mean, absolutely. We can support natively pretty much any resolution that a camera can capture and we edit natively in that resolution. Don't try and do it on a MacBook Air and a USB drive.
We worked very closely with a guy named Jeff Brue, who is the CTO of a company called Open Drives. The systems they put out for Rock Paper Scissors were mind-bogglingly powerful, so we were talking about 32-core CPUs and 128 gigs of RAM. We had Fusion Drives in every machine for caching, crazy fast beyond SSD. We had all-SSD RAID arrays and an extremely fast fiber feed to each edit client. Yeah, we didn't mess around on that one, and that was a very, very serious installation.
Look, it's important to note we talk about being resolution-independent, being able to mix formats, being able to basically cut whatever you shoot, and that is true. But of course, there is always a performance question that needs to be answered, and I always like to joke about the MacBook Air thing. If you're going to cut native 6K, you need to think about the system you're going to be doing it on.
Poppleton: OK. Now, back to the average consumer who maybe is not looking to spend that, but is just looking to make a good film and maybe get into some smaller film festivals. Aside from Premiere and probably After Effects, what other applications are filmmakers using right now?
Mooney: We have this thing called Creative Cloud. That's how we sell our creative products. I always sound like a mobile phone salesman when I say this, but it's $50 a month, and the beauty of it is you get every single creative application we make. If you think about what a filmmaker goes through when he or she makes a film, from initial idea all the way through to final delivery, we like to think that we have an application in Creative Cloud that works for every step of the way.
We see a lot of people at the very beginning using things like Illustrator and Photoshop for pre-visualization, storyboarding and so on. When you have got yourself an idea and you've got yourself a camera to shoot on, and again, I'll keep coming back to this, it is just staggering how only 10 years ago, the camera would be tens of thousands of dollars and the editing suite would be tens of thousands of dollars, and now you're getting this amazing amount of power for next to nothing.
Then you go and start acquiring media. You start shooting. You do your interviews, whatever it is. When you're bringing that media into the system for editing and further elements of the creative process, we have an application called Prelude, which is designed very specifically for the early part of a workflow, the first half of post-production, if you like: the ingesting of media, the logging of media, potentially the transcoding of media. Most importantly, it's looking at your dailies, gathering selects and marking them up with metadata, because obviously, in a file-based workflow, meaningful metadata is crucial to the discoverability of media and to making sure you're picking the best possible things you can.
You can go to Prelude and do all that logging. You then go into the meaningful edit, and that's obviously Premiere Pro. The way we talk about Premiere Pro is very much as the center of a workflow. We find that certainly most independent filmmakers, and actually, this is how Fincher worked, treat the timeline as the most important thing. The edit timeline is the truth. That's the most important part of the creative process.
There will often be times when, depending on what kind of movie you're making, you will want to go and, like you said, use After Effects for stuff. That can be something as simple as stabilization or compositing, through to adding muzzle flashes or adding rain when it wasn't raining, and all the creative things you can do there. But we really talk about Premiere as being in the middle, and we can go out to the other applications and come back to Premiere Pro, that being where the truth lies.
If I need to add some complex VFX, I can go out to After Effects and come back into Premiere Pro. If I need to do more advanced audio, when audio becomes more a part of my creative process, I can move out to Audition and do some effects track laying, whatever it might be, and come back to Premiere Pro. Once I start thinking about color, I can go out to SpeedGrade and come back to Premiere Pro. So really, we describe Premiere as being the center around which the other applications orbit, with a very, very simple path out to those other bits of the workflow and back in again.
I think really what's unique about the Creative Cloud offering is you have pretty much every tool you need. You can then go to Media Encoder to deliver your DCP directly to projectors. I think that's what really resonates with independent filmmakers. They can pay one price and pretty much do everything they can think of, and I think that's quite powerful.
Poppleton: Right. Again, when you were talking about what was tens of thousands of dollars, and now it's an actually pretty reasonable monthly fee, it really does bring it all to the filmmaker in a very accessible way.
I know we're running short on time, but I'd like to talk about the submission process for, say, a film festival. Are most people just asking for digital files online? Are they just uploading a file to some server? Or do they eventually still need to print their projects out to film?
Mooney: Oh, right. I will slightly swerve this question; that’s not something I've ever done personally. I know you're going to have the opportunity to meet with some of the filmmakers at the festival, and I'd encourage you to ask them about the process. I'm good friends with Kyle Alvarez, who you're going to meet, and I know this is the time when they're all sort of tearing their hair out and getting 20 minutes of sleep a day.
The short answer is you would submit a file that's watchable over a standard machine, that'll then go through the various motions of the approval process. Once you're approved, generally what they'll want is a DCP, a Digital Cinema Package. It's extremely unusual, and I would say almost nonexistent, for a festival to ask for film now, which is good, because actually laying off to film would be probably the most expensive part of the process that we've been talking about.
It's generally a DCP, like I say, a Digital Cinema Package. Most often, the film festivals are in smaller theaters with 2K projectors, so we actually did a deal with a company called QVIS whereby you can export to a 2K DCP from Media Encoder without having to add anything else. If you cut 4K, and as we said at the start, many more are, and a 4K DCP is requested, then we have a workflow for that too, whereby you basically just spend a bit more money on some QVIS software.
The short answer is the application process will take a smaller file, like an H.264, but for the actual screenings you'll almost definitely be required to provide a DCP.
Poppleton: Is that pretty much an industry standard now? Is that how things are kind of working around Hollywood?
Mooney: Yeah. Absolutely. Yep. I mean, DCP is the format that basically any nonanalog projector in the world will take now. I'm saying that and questioning myself, but I'm pretty sure that's the case. DCP is the digital cinema projection format.
Poppleton: Finally, I've noticed Adobe sponsoring screenings around the festival. I'm curious, are these specifically films that have used your software and have used your tech? What specifically made you choose to put your name alongside individual titles around Sundance?
Mooney: The particular movies we're championing, of course, it's because they've been using our tool set. What I would say, though, is we do like to go further than that. We like to be, and it sounds a bit corny, but we like to be what we call, sort of, thought leaders in the industry.
Really, the big reason we go to these film festivals is to exhibit thought leadership, to host these panels and engage with filmmakers, but also, by engaging, to get feedback from them in order to help build better products. Really, this is an important part of the market for us, so we want to show the world that we care about this industry, and at the same time be able to pull back feedback from it to make better products in the longer term. I think that's the primary reason we do these things.
Travis Poppleton has been covering movie news, film reviews and live events for Deseret News and KSL.com since 2010 and co-hosts the FlixJunkies podcast. You can contact him at firstname.lastname@example.org.