Old Ben, New Tricks
A Conversation with Chris Moseley & Chris Herr
By David Daut
Seventeen years after Star Wars: Episode III hit theaters, Ewan McGregor returns to the role of Jedi Master Obi-Wan Kenobi, now cut off from the Force and hiding from the Empire after the fall of the Jedi Order. An urgent message pulls him out of his exile when the young Princess Leia is kidnapped by Darth Vader’s agents and Obi-Wan is forced to confront the truth that his former apprentice still lives. The six-episode limited series is directed by Deborah Chow and stars McGregor along with Vivien Lyra Blair, Moses Ingram, and Hayden Christensen reprising his role as Darth Vader.
The spirit of Star Wars has always had one foot in the past and one foot in the future. George Lucas’s original films were nostalgic throwbacks to radio serials, Saturday matinees, Westerns, and samurai movies, but they were also groundbreaking technological achievements that changed the art of filmmaking forever. Star Wars is, as ever, about balance, and that spirit of balance between old and new continues in the limited series Obi-Wan Kenobi.
Camera Operator had the opportunity to chat with A camera operator Chris Moseley and B camera/Ronin operator Chris Herr about their work on the series and about bringing a new, more modern look to this story about familiar characters. We talked extensively about Industrial Light & Magic’s StageCraft technology and the LED Volume at its heart, which can conjure up fantastical worlds rendered on set in real time, and the new challenges and opportunities for camera operators that come along with it. Also discussed was the unique shooting setup for the series that relied primarily on gimbals instead of Steadicam, working on a show that at times felt more like a movie than television, and the sheer childlike awe of seeing Darth Vader right in front of you.
On July 28, 2022, the Television Academy announced Industrial Light & Magic would be awarded an Engineering, Science & Technology Emmy Award for its StageCraft virtual production technology.
This interview was recorded following the release of the third episode of Obi-Wan Kenobi.
Camera Operator: With its use on shows like The Mandalorian and The Book of Boba Fett, as well as in recent feature films like Matt Reeves’ The Batman and Thor: Love and Thunder, I think it’s probably safe to assume that a lot of people are at least broadly aware of Industrial Light & Magic’s StageCraft technology and the LED Volume at the heart of it. But for anyone not familiar, could you give a quick overview of what that technology is?
Chris Moseley: Basically, the Volume is a curved LED screen that wraps around the set. On it, you can load in any backdrop you want as well as elements like digital people that move, spaceships that fly by, interior or exterior sets with blinking lights, a sun, a planet, anything. It’s very realistic. On top of the screen, the Volume is surrounded by witness cameras that are connected to the camera we’re shooting with. They track our camera as it moves within the environment, and the image on screen matches those movements so that the perspective is always correct. The benefits of it are great compared to just having a blue screen there. It really makes it easy to compose your shots because you can actually see what your background is going to be, and it helps the actors immensely because they’re able to react to the environment instead of looking at a blank screen. And photographically, all that light plays on the actors’ faces. That’s what makes it so realistic. Maybe Chris can get more into the technical aspects of it.
Chris Herr: I had all the nerdy conversations with many people. I mean, there are, like, 20 people that are just there for the tech side of it every day, just to keep the whole thing running smoothly. Those people are brilliant and are a wealth of information. Chris mentioned framing; the interesting difference between the Volume and blue screen is that if there’s an element like a moon in the sky, or another planet on the horizon, you can adjust your frame accordingly on that day as opposed to letting the VFX compositors compose your frame in post. So you’re the one making decisions on placing that moon or something in the sky in between two people or behind someone.
Moseley: You can actually see what’s going to be in frame instead of someone saying “there’s a big creature over there coming at the camera” without really knowing how big it is. So much of what we compose for are backgrounds—structures, people, even colors—and being able to see all that is incredibly helpful.
Herr: One thing I noticed is if you have 100 background actors in there—which can all fit in the room since it’s about 80 feet across, and you can really pack it in—and if you have, say, a ship flying in, then all 100 people are looking at the same object as opposed to shots where everyone’s supposed to be looking at the same thing, but the eyelines don’t all match up. I don’t know that everyone notices that, but I definitely pick up on that difference. On the technology side, what I think makes it so realistic is they have the world built in ILM’s StageCraft software. It’s fully built out in 3D space, and this 80-foot room exists within the 3D world, so that 80-foot space is our bounding box that we can move around in. Then they overlap the virtual camera with the position of the real camera and project that onto whatever field of view we’re seeing. The position of the camera within the Volume determines the parallax, the compression, the perspective shift, and everything that’s projected onto those LED walls. It’s really cool to see through the monitor. Say there’s a road or a path going away from you, and as you’re moving around, that path just continues out into the screen. The only way to break the illusion is to step away from the camera, then you’ll start to see that it bends off and the perspective no longer makes sense.
Moseley: Yeah, or when you look on the ceiling—because the Volume has a ceiling too—all the buildings look bent over, but when you tilt the camera up, and the witness cameras track that movement, then the buildings go upright. It’s really interesting.
Herr: They can also drag in blue screens by creating a box on the screen that they can change the size and drag around instead of bringing in a physical rag. You’re just adding these blue squares for areas where, say, they don’t have something rendered, or they don’t want to commit to a certain animation. An ILM artist comes up with an iPad and drags in a blue screen. For that matter, they can drag in lights or they can drag in negatives, so there are these black and white or tungsten squares floating around. A lot of the lighting was done in that way, just controlled from an iPad.
Moseley: Any color, any shape you want to make it, you can. And you can move it right to the frame line.
Herr: You can feather it. They’d even tease those off. They’d set those squares up, and they’d bring in real teasers to tease that section off as if it were a big light panel.
Moseley: It makes the lighting really quick and easy, too.
CO: It’s such an interesting piece of technology. And obviously that versatility you’re talking about was critical to the development of it in the first place. Going back to The Mandalorian and that idea of wanting to do a Star Wars show that has the sweep and the scope of Star Wars where you’re visiting different planets, and figuring out how to do that on a TV budget and a TV schedule. And Obi-Wan is no different. At the time of this interview, we’re three episodes into the series and we’ve been to half a dozen planets at least. What is it like shifting between these different virtual environments in this virtual space?
Moseley: Yeah, almost everything has to be planned ahead of time. They have all the screen load-ins built early on. There’s lots of concept art of sets and different planets. The physical space of the sets has to be figured out so they can marry up to the backgrounds. Almost everything is prevised. It’s much more prep-intensive than a regular movie. We did have a big exterior town built, as well, where the bad guys land in their big ship in the first episode.
Herr: Oh yeah, Tatooine. There’s also Alderaan, Leia’s home planet, or where she was adopted to. [Daiyu] is the Blade Runner–looking planet, which was cool. That was fun because they did a wet-down on the stage so you get all the reflections, all the neon playing off of the wet ground, which really sold it.
Moseley: Yes, that really looked great. Anything that the screens can reflect on makes it look very real.
Herr: I think the darker scenes with little highlights like practical lamps, holes in a grate where light’s coming through, that stuff works really well on the Volume as opposed to the bright desert sets where you have bright sky and a bright floor, but say you want to see a cave entrance somewhere. That’s gonna be really hard because now you have all this ambient light fighting for a real dark point. The contrast ratio is easier to maintain on the darker sets as opposed to when more of the set is bright.
Moseley: What also works well is if you have a practical set inside the Volume with windows that look out to the screens. Instead of having a blue screen outside the window, you can have a moving environment, day or night, with any type of light coming through the windows.
Herr: Yeah, that was like Leia’s bedroom in the palace on Alderaan. That was a practical set with glossy floors. That’s something you can do that’s kind of unique because you can have these glossy floors and shoot low on those floors and see the natural reflections. You don’t have to then paint out the blue reflection in those floors that you usually have. You can just shoot it as is.
CO: Talking about that balance, obviously a lot of the series was shot on the Volume using the virtual environments, but you still have some practical sets and even some location shooting. The third episode largely takes place on this sort of Joshua Tree–looking mining world with mountains and desert plains. Am I correct in assuming that that was a location shoot?
Herr: Yeah, that was on location at Mystery Mesa near Santa Clarita. So, that was all real. Obviously, there are a lot of extensions; they added mountains, they added all the mining infrastructure, the factories, smoking smokestacks in the distance. We did bring in the Joshua trees. Those are real, those were stuck in the ground. I think the challenge with that is the hard sunlight. That’s one of the things that they’re still trying to figure out; how to get hard light into the Volume. If you have hard light hitting the ground, that point where that transitions into the LED wall, it’s hard to match that brightness. And then there’s an issue where the light hitting the ground then lights the screen, so you lighten up the screen to match, but when you lighten up the screen, you’re now lighting the ground even brighter. You end up with this feedback loop. That transition was always something they were tweaking and trying to get right. Especially on the wides—the big wides where you kind of see the roundness of that stage. I think they definitely had to go in there and tweak a little bit, but there are plenty of instances when we’re shooting normal lenses, tighter, that you cannot tell where it starts and ends.
CO: That’s so interesting! Speaking to some of these unique situations in the Volume and the lighting challenges that you were mentioning, how different is it shooting on the Volume versus shooting on a traditional set or location shooting? Are you using similar cameras and lenses, or is it all totally different and unique?
Moseley: Well, we’re using really all the same lenses and cameras. When you’re shooting on the Volume, large format cameras tend to look best. You try to keep the depth of field down so things look better. We also try to get lenses that are more old-school—lenses that aren’t super sharp. Softer lenses help the LED screen to not be too sharp. We shot a lot with anamorphic lenses as well as spherical lenses for certain things. We had a mixture of lenses, but mostly we just try to keep the depth of field down. You have to pay a little bit of attention to how you shoot things when you’re on the Volume.
Herr: Also, it’s hard to shoot two cameras. Three cameras? Forget about it. Basically the field of view of your camera as it hits the wall is your background, and that travels with you as you move around. So if someone’s shooting a wide two-shot, and then somebody wants to do a tighter single, you’re now within the wider shot’s “frustum”—in other words, the rendered area of the screen. You can shoot within that if you’re tight enough, and it can kind of work even though the perspective would be from that other camera. But if that camera pans away, you’ll see this blurry line come through your frame where it goes from the rendered frustum back to your field of view that they stack underneath because there’s a hierarchy of cameras within the Volume. They’ll prioritize the wide and then give second priority to B camera, whose frustum would go behind the A camera instead of busting the A camera shot with their background. They can outline them in red and blue just to see where they are, so there are these squares flying around the Volume as you point the camera around that show what your field of view is. So, that was a challenge, but we found ways to work around it, like shooting a wide and then going in real tight within that so you don’t notice the perspective being off. But three cameras in the Volume is definitely a challenge. I’d say the only time we did three cameras was on “dirt shots.” They call it a dirt shot when it’s literally just practical sets, mostly looking down. You can throw those in anywhere. But yeah, multi-cameras are a challenge.
CO: That’s something I hadn’t even considered. The perspective is tied to the camera. And even forgetting any issues of physical logistics or physical space, just the sheer rendering power of it, if you’re shooting multiple camera angles, each camera has to be rendered from its own unique perspective. That’s very interesting.
Herr: They can make the frustum wider than what the camera’s really seeing to accommodate another camera. They can make it half the Volume from that perspective so you can shoot within their frustum. As they’re panning around, it’s not going to kill your shot. The background is always just like a static panorama of that set by default. When you don’t have any cameras in there, it’s just a static 360 image. When you bring a camera in, that’s when the perspective starts to change. So as you pan around, you go from the 360-degree still image to a live perspective. Sometimes set pieces will move around as well, like you’ll pan over to some ship and in reality it’s not where it is in the 360 image. It’s 30 feet to the right because physically in space, you are in a different location. It gets a little confusing sometimes.
Moseley: But they can always fix it in post if they have to, if they really want the shot.
Herr: Well, with the budget we had, they can fix it in post, but I imagine in the future little productions are gonna go in there, and they have to get everything in-camera final. We would strive for in-camera finals as much as possible, but there are times where we’re doing something where they have to paint out cameras, or they have to replace the ceiling, you know, various fixes. But overall, I think it definitely helped the look of the show, especially with reflective things, helmets, or just even in somebody’s eyes. You don’t have a blue screen in their eye, you have the real city lights or the full environment.
CO: Sort of speaking to the logistical and spatial challenges of shooting in the Volume, the first couple episodes of Obi-Wan Kenobi feature some chase sequences. In the first episode, you have Leia running away from her kidnappers on Alderaan, then in the second episode, on Daiyu, that sort of neon-lit, Blade Runner planet, you have the bounty hunters going after Obi-Wan. How do you go about shooting a chase sequence within the physical constraints of the Volume?
Herr: There are a couple of ways we extended that. In the chase where Leia’s running through the market, they built an S-curve through that space, so there were three zones to be in, and you could cross through them. We had an RC car with the Ronin on it running through the stands and under the creatures’ legs.
Moseley: Yeah, it’s hard to shoot these chases inside the limited space of the Volume. You run an actor 20 feet, then all of a sudden they’re against the wall, and you can’t really shoot people right next to the wall because you will see the screen too well. You have to find ways around it. A lot of times you have to do it in cuts. Get them running one way, turn around and get them going the other way.
Herr: The rooftop fight, that was blue screen. They actually built that on one of the larger stages and had two or three rooftops and it took a left turn and they could keep it going. It was fun, because I got to get wired up from the ceiling to run across and jump the gaps with them. And pulling Ewan across the roofs backwards, that is tricky as well—they built little foot bridges for those—but that was a fun scene. And then the stuff in the distance of the people fighting, or the different bounty hunters shooting back at Ewan, that was some second unit work. We would shoot Ewan firing at basically Xs on the blue screen, then they would shoot those other elements and add them in later. So, that was a more traditional approach. But for things in the Volume, what they would do is have three rendered Volumes—an A, B, and C—and they would all overlap. They had CAD models of the set environments, and you could see the outlines of the dimensions of the Volume. That outline would be copied and pasted, so that throughout the day, we could shoot Volume A—running, pulling, pushing—then switch over to Volume B by moving the virtual space and the set dressing. For example, [in Episode 2] in the ship dock where they are running away from the Inquisitors and trying to board the cargo ship, the Grand Inquisitor gets stabbed. At first they come into an open space and you see all the containers in the distance—those containers are digital objects on the LED wall. Next he’s walking through the containers, so now they bring the physical containers into the set, move the 3D world forward, so now when you look back, you’re looking at the open space they just came from, but now that’s what’s represented on the wall. Then they do that one more time for the actual ship dock where they get into the ship. A scene like that is shot over multiple days, getting everything in each section.
So, first we’d shoot Obi-Wan and Leia’s entrance—the two of them talking and then going to the crates—and then you have Reva’s entrance, which you’d shoot that same day, in that same Volume setup, because it takes too much time to go back to a previous setup. You’re sectioning the shooting off for each setup, like you would do with a normal set, only instead of building one giant set, we’re dividing it up into smaller setups.
Moseley: What they’re really creative at is making set dressing that’s identical on both sides of the set. That way when you finish shooting in one direction, you don’t have to turn around, you just flip the background plate and you don’t have to move any set dressing. You’re still pointing the camera the same way, but now you’re shooting the other direction.
Herr: Which made eyelines and look directions kind of complicated. We’d be looking one way—someone’s looking left to right—and then we say “okay, we’re gonna turn around,” everybody goes outside, they load up a new background and we’re essentially looking the same way, but that person is now looking right to left. They’re still in the same space, but we just flipped them to match the background. We just have to make sure the eyelines work, because you’re not actually moving the camera, you’re moving the actors. The camera stays in one spot.
Moseley: It takes a little bit of thought once you come back to the other side.
Herr: Where were we pointing? Because there’s a door; there’s a section of the Volume that’s open for stuff to get in and out. They can kind of close it off, but they’re really just lighting panels, they’re not connected into the main wall. So, it made it easy in a way to turn around because you didn’t have to move stuff, but it also made it hard in a different way.
Moseley: It makes turnarounds extremely fast. If you were lighting from one wall of the Volume, when you turn the Volume around, your lighting has flopped as well.
CO: It’s an interesting sort of head trip, because in a certain sense you are moving the world instead of moving the camera, which is kind of wild.
Herr: And they can do that! They can rotate, they can shift. And that was a trip. When they would go to rotate the world, say, because we’re seeing one of the doors, we would pan over to another section of the wall and they would move the world into frame. And that’s weird, when you’re standing on solid ground and everything around you starts to spin. It is a little disorienting sometimes.
CO: Oh, I bet! Going back to what you were saying about shooting in the ship dock and having these three different pieces of that set loaded up in the Volume at different times. One of the interesting things about this technology is it really necessitates decisions to be made a lot earlier from a visual effects and virtual environment perspective than they would on something that was shot entirely against blue screen or green screen. How does that impact you, as camera operators, having these virtual environments built beforehand versus only having an idea of what it’s going to be once it’s finished?
Moseley: Yeah, absolutely. There’s so much thought that goes in ahead of time. Months and months before we ever get there. They have to create the worlds, they have to create the set pieces that go in the world and figure out how the practical set pieces will match up with the screens. Pretty much everything is storyboarded, and the DP will get in the Volume and color correct some of the backgrounds and add his lighting that matches ahead of time. When Chris and I get in there, pretty much everything is set and ready to go, but we still have freedom to change the prevised shots to make them a little more dramatic or a little more interesting.
Herr: They had pretty detailed 3D previs of almost the whole series aside from the fight scenes, where they had run-throughs of the stunt performers doing the fights. Deb, the director, had her iPad, and sometimes she’d come over here and show a few references of fully animated camera moves for us to base off of. Then we would go in and say, “Well, this looks better with the way the background actors lined up,” or, “The way that we shot yesterday, it makes more sense to do this.” So, in a way, it’s really constricted from a production design and lighting side, but then once you dive in, you actually have a lot of freedom with the Volume. You can pretty much look anywhere. You’re really not having to then stick to those boards. I think the boards just give everyone else a template to do all their work.
CO: You mentioned Deb Chow, who obviously had worked with this technology before on The Mandalorian. What was it like working with her on this with her directing every episode of the show?
Moseley: Deborah Chow is really great! It was interesting as far as TV goes, because usually you’re bringing in a different director for every episode. But this was her baby—producing, directing—it was all her, which is a huge task for one person and very stressful. I wouldn’t want that much responsibility, but she took it on!
Herr: She knew the material really well, too. Front to back.
Moseley: Yeah, all about all Star Wars all the time.
Herr: But she let us take the reins when it came to the operating and the minutia of eyelines and frames and stuff.
Moseley: She was good like that. She had so much on her plate, it’s just daunting, because it’s big stuff. You’ve got all those creatures and extras. I think she probably answered a million questions a day. Wardrobe and makeup. And I mean, there’s only one person to go to. There weren’t nine or ten producers to field stuff. But she’s very smart and a good collaborator. She gave us a lot of freedom in blocking shots and how to shoot them.
Herr: Then you have the passionate Star Wars fans that you have to please. It’s an army of people that are going to see your every mistake and misstep and oversight, so she was really meticulous with getting all the details right.
Moseley: Yeah, probably more than any other series. There are some serious Star Wars people, and if you do something wrong, they’ll call you out on it.
CO: Well, for what it’s worth, I was at Star Wars Celebration this past weekend, and I actually got to see the premiere of the first two episodes in a room with a whole bunch of people and there was a big reaction. People loved it. By that metric, at least, I think you all succeeded.
Herr: That’s nice to hear.
Moseley: Yeah, that’s good to hear! I think at least for me, one of my big worries was little Leia, who was only eight years old, and we all know that it’s hard for eight-year-olds to be on a set for ten hours a day, but she’s amazing! Her acting is so good!
CO: She’s phenomenal! And kind of digging into that legacy of Star Wars, this series is set almost exactly halfway between Episode III from 2005 and the original Star Wars from ‘77. How much did you look to those films and, more broadly, what had been done in Star Wars in the past to influence your work on this show? Or were you more charting your own course and letting this series find its own style?
Moseley: Well, I can speak for myself. I know Chris is a pretty big Star Wars fan, and I know he delved into all of Star Wars and probably knew them already. For me, I saw the original ones back in the ’70s and I saw Rogue One, which is the one that precedes them, but I think Deb and [DP] Chung-hoon Chung wanted to do something different from the other Star Wars films, and it’s not like The Mandalorian either. I think with the success of The Mandalorian, it would have been easy to try to copy that style because it is so good, but for Obi-Wan, we wanted something that feels more like a feature film. We tried to make it more modern, much more moving camera, much darker. A lot of Star Wars is shot very classically in static frames; the fights will play in a wide, very static shot. We tried to not do that and instead do a lot of moving camera, much more modern lighting. As much as we wanted to honor that legacy of Star Wars, we also wanted to give this its own style.
Herr: It’s also a lot more handheld. I think after Episode 2 that gets ramped up even more, but I do not remember seeing much handheld in the original stuff.
CO: No, not at all.
Herr: It’s all wides. Occasionally they go in tight somewhere when they need to, but that’s when they’re static and having a discourse. Then they go back to fighting and it’s on all these wides. I think they did a good job of still getting wides of the fights.
Moseley: Yeah, wides for sure, but moving wides with a lot of different tools. I haven’t seen the handheld episodes yet. We’ll see how that all plays.
Herr: We’re getting in there too. When we do the fights, we’d play one wide and then we’d play one real tight, which was a challenge. Shooting close-ups of people tumbling around with these light bars—lightsabers—flying through the frame. Trying to find the perfect base in all that was a challenge, which leads to how we did a lot of that. Some of it is traditional handheld, and then we also use the Ronin in a dual operator scenario. They wanted to stick with the handheld look, but doing movement in true handheld just gets way too shaky. We use the Force Pro to control the Ronin, so Moseley was on a Mimic controller in some of the fight scenes. I’d also move around with the Ronin in some fight scenes and Moseley would be able to give it that handheld feel while still being able to travel quickly. Going down low with them as they’re on the ground and back up, it’s so much harder to do with traditional handheld with these heavy builds.
Moseley: We did a lot of dual operating with the Ronin where I would either be on the wheels or with the Mimic rig and Chris would be running around following actors. And that’s completely different from anything they’ve done in Star Wars. Obviously they didn’t have the technology before, but even as far as Mandalorian goes—which I worked on a little bit as well—they don’t shoot it like that at all.
CO: Speaking to the differences between this and The Mandalorian, you talked about how this feels much more like a movie than TV, and Obi-Wan was originally conceived as a movie before it was later expanded out into a six-episode series. Obviously, that decision had been made long before shooting ever began, but do you feel like that early conception influenced how you approached shooting on the show?
Moseley: Yeah, with The Mandalorian, Jon Favreau is a big fan of the Western look, and that’s kind of like how the original Star Wars were done with those wide, low angle, cowboy-comes-into-town shots and the Sergio Leone over-the-pistols shots. That’s kind of how Mandalorian is shot, but I think Chung-hoon wanted to make [Obi-Wan] his own. He brings a lot of photography from Asian culture. He’s Korean. I think he’s influenced by a lot of martial arts movies where you have a lot of low angles, but also a lot of high angles looking down, and all these fight scenes where it’s choreographed almost like a Busby Berkeley number. He brings that to it, then Chris and I—because we do a lot of Ronin stuff together—we’ve brought in this moving camera style. So, I think between those things, we made our own photographic footprint on Star Wars.
Herr: We had an interesting way we could ramp from traditional dolly, crane, fluid Ronin work into a hybrid of Ronin and handheld where we didn’t want it to be too crazy, then into full-on handheld, where it’s just grab the camera and go embrace the craziness of it. We used the ZeeGee a little bit too. It’s meant to go on a Steadicam arm, but you balance the camera passively on this gimbal. It allows you to do handheld while you’re on a Steadicam arm. We used that for when Reva’s questioning all the townspeople and chops the girl’s arm off. That’s all ZeeGee work because it lets us move really fast, but still maintain some kind of composure over the handheld. We do a lot of that. And that same company makes the AntiGravityCam rig, which is a bungee rig for the gimbal that let us get a lot of the really low angle tracking shots as well. It was nice to have that gradient of options.
Moseley: Yeah, we ramped it up. We started out with a lot of crane work in the first two episodes as we’re establishing the characters, then as the chases and the fighting gets more ramped up, we used a more frenetic style with the camera.
Herr: And I think at some point we also used the MATRIX head with their Mimic controller to do the same thing. So, we had a Scorpio 45 every day, and they would send the MATRIX up on the fight scenes, and Moseley would be on it, operating the Mimic handheld there. I think at that point, I was just normal handheld, below Moseley, running around, and we would tag-team it that way. That’s in the later episodes, so we can’t really talk too much about what those are.
Moseley: I think it works well for the fight stuff. We tried to calm it down during calmer scenes, then ramp it back up for the fights.
Herr: If you notice, even the stuff that looks like it would be Steadicam is actually Ronin with a little bit of an uneasiness to it, which is that Force controller injecting that in. I’m trying to set the record straight that it’s not me doing shaky Ronin. [Laughs] And we did traditional Ronin single-op as well, like out in the desert where I’m just running around all day shooting moving overs and pulling them across the mining planet. We did another interesting thing where I’d take the Ronin and we’d use the FLOWCINE GLINK—I gotta mention them because we used a lot of that rig—we’d turn the Ronin off, lock all the axes, strap them down so there was no play, and we would do some things just like that. I’m in my rig and just operating these shaky handheld shots, but have a nice boom range—even more so than you’d have on an Easyrig. You have basically two Steadicam arms on the GLINK letting you boom. So, we were able to do that, and I could unlock it and reengage it to do the walk-and-talks and Moseley would go back on the Mimic and give it that handheld feel as I would pull them. If you did true handheld walking backwards, it’s going to be more shaky than they wanted in that moment. They gave me one of the first pre-production models of the GLINK for this show because we didn’t have Steadicam, and that’s also kind of unique to this show. Moseley doesn’t do Steadicam, I don’t do Steadicam—I come from the gimbal world—so they just said “yeah, we’ll do gimbal for all the movement.” They just committed to that fully. But I needed the rigs that allowed me to get in the places that I needed to get in. Some of these rigs have support from overhead, like the AntiGravityCam, which is great for getting low, and then the GLINK supports from below like a Steadicam arm, so you don’t have anything above your head. So for little cave sets or getting into corners, that thing was nice. I don’t know how we would have done it without that tool.
It really opened up a lot of possibilities for us.
CO: I know there’s not a lot you can say about the remaining episodes of the series, but without getting into any specifics that might get you in trouble, what were some of the most exciting or rewarding parts about working on the show?
Herr: For me it was all the practical costumes and props—the weapons and droids and robots and the stormtroopers. Pretty much everything you see is really there on screen—C-3PO, Vader—they’re all standing in front of you, in full force, looking how they do on camera.
Moseley: It’s amazing. Even if you’re not like a complete Star Wars fan, when Darth Vader comes on the set, it’s pretty awesome. Or any of the droids. So, it’s always cool. Sometimes you’d walk into one of these Volumes, and you’d look around and think, “Wow, this is unbelievable!” So, it was always a surprise, and I think just doing a Star Wars thing is so fun.
Herr: The sets really were huge. The amount of foam and wood they use to construct these worlds—we had probably six or eight stages at Manhattan Beach, and at lunch sometimes I would just walk around and see the progress of the next week’s set builds. So, there’s always this revolving door of sets being built and torn down. And there’s a lot still to come in terms of new sets. Vader was cool. We had Dmitrious [Bistrevsky], who played the suit version a lot, and also a stunt performer who would do the fight work. And then Hayden [Christensen] was in it a few times. They had a movement coach named Olga [Sokolova] training him how to move like the original Vader, because Vader doesn’t walk like a normal human. He has this certain presence about him. So, Olga—she’s a professional ballet dancer—they brought her in, plus eight other people to manage Vader’s cape and his helmet and the lenses in his eyes.
Moseley: It was a whole team of people.
Herr: They had a chair they’d bring around for him to sit on between takes, and the electronics in his suit, the lights—it was a whole department in itself. The Vader team.
Moseley: You gave away that Vader’s coming on this.
Herr: [Laughs] It’s Episode 3! Everybody knows! Don’t scare me, Moseley! Don’t scare me like that.
Moseley: Well, you know our phones are bugged. [Laughs]
Herr: This is kind of a fun story. The first day of shooting—we were up in the Lake Arrowhead area—I was using the Artemis app and framing up a shot with Chung-hoon, saying, “Here’s what the 29 looks like. Here’s what the 50 looks like.” After we finished that, I walked back over to set and this guy comes out of the woods—a guy who looks like he’s out of Men in Black—and he pulls me to the side and asks, “Were you taking photos?” I told him it’s a viewfinder app and we were lining up the shots. He asked me to open it up, I had to show him, and I had to delete the photos I’d taken for Chung-hoon’s reference. You just could not bring a phone out and hold it up, and that became a problem because we use Artemis so much. So eventually they made a concession where we signed our lives away to say we could use this one app, and our badges got a special sticker on them so they’d know we were allowed to use it. So, there are challenges just working on something with so many secrets.
CO: Oh, I bet!
Herr: Another fun moment for me was in Episode 3 when Obi-Wan first brings his lightsaber out. We had to do a POV of him spinning around in that gravel mine, and we did the same thing—strapped down the Ronin on the GLINK—and I was able to do my Obi-Wan acting, panning around, looking for Vader. Then at one moment, they said, “Well, we should do one with the lightsaber. There’s something missing, we don’t have that blue glow.” So, they gave me his lightsaber, all lit up, full prop and everything, and I did that POV, swinging the lightsaber across the lens a couple of times to give it that effect. They ended up using those shots, so I’ve added Obi-Wan lightsaber double to my resume.
Camera Operator Fall 2022
Camera operator Chris Herr, Vivien Lyra Blair, Jimmy Smits & camera operator Chris Moseley on the set of OBI-WAN KENOBI
Photos by Matt Kennedy and Nicola Goode, Lucasfilm Ltd.
TECH ON SET
DJI Force Pro Camera Movement System
MATRIX Remote Head
FLOWCINE GLINK Gimbal
Cinema Devices ZeeGee
Cinema Devices AntiGravityCam
BEHIND THE SCENES
Chris Moseley is a director of photography and camera operator who has been working in the film business for 35 years. He was born and raised in California, where he studied at the American Film Institute.
Chris Herr is an LA–based camera operator specializing in gimbal operating. He began gimbal operating after the advent of the original Mōvi M10 gimbal in 2013 and has since completed 18 feature films and over 100 commercial projects. He’s best known for his work on Straight Outta Compton, A Star Is Born, and Obi-Wan Kenobi, as well as his company Blackbird Cinema, Inc.
A writer and critic for more than a decade, David Daut specializes in analysis of genre cinema and immersive media. In addition to his work for Camera Operator and other publications, David is also the co-creator of Hollow Medium, a “recovered audio” ghost story podcast. David studied at the USC School of Cinematic Arts and works as a freelance writer based out of Orange County, California.