Taken from SONYCine – providing the cinematography community with information, inspiration, and news

Snehal Patel is a film and television professional with over two decades of experience creating content and adapting new technology. 

Snehal is also a FOFO … (Friend of Factory Optic) – he has known and worked with our boss for many years – and very recently, he worked directly with our team testing our new SYNCRO-LINK MARK ZERO. Snehal was impressed. And he's a guy who really understands the importance of capturing metadata correctly … listen:


So how do we benefit from this on set?


For virtual production, you need this data in real time, because if you have a wide-angle lens and you're doing a Bugs Bunny basketball movie, like Space Jam, you have animated characters standing next to live actors or basketball stars. This is very common in virtual production. You have AR [augmented reality] objects or background objects, or you're standing on a floor. Now, if you're standing on a real floor and I put a wide-angle 18mm lens on the camera, that floor bends. The floor appears to be lower at the bottom of the frame than it is in real life, because of the curvature of the lens. The further you are from the center of the lens, the more your corners will bend, as will your walls.

… let’s say you created a cartoon character standing next to you, or jumping around you. It won’t be standing on the ground; it’ll be floating maybe an inch off the ground. Its feet won’t appear to touch the ground because we haven’t accounted for the distortion. Not only do you want to track the shading and distortion characteristics, but you need to apply them in real time, through the Unreal Engine, for example, in virtual production; otherwise you can’t line up your objects.
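The effect Snehal describes can be sketched with a simple radial distortion model (the Brown–Conrady form used by most calibration tools). This is illustrative only: the function name and the coefficients `k1`/`k2` are hypothetical values for a wide-angle lens, not data from any particular lens or from SYNCRO-LINK.

```python
# Minimal sketch of radial (barrel) lens distortion, Brown-Conrady model.
# k1 and k2 are made-up illustrative coefficients for a wide-angle lens.

def distort(x, y, k1=-0.18, k2=0.03):
    """Map an undistorted normalized image point to its distorted position."""
    r2 = x * x + y * y
    factor = 1 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# A point at the optical center does not move at all...
center = distort(0.0, 0.0)
print(center)  # (0.0, 0.0)

# ...but a point near the bottom edge of frame (a character's feet on the
# floor) is pulled noticeably toward the center, so a CG character rendered
# without this correction will not line up with the real floor.
edge = distort(0.6, -0.8)
print(edge)
```

Apply the same mapping to the virtual render (or its inverse to the plate) and the cartoon character's feet land back on the floor.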

… let’s forget about shading and distortion data for a minute and instead look at something simpler, like how this data is used on the TV show The Mandalorian. You need the exact focus and iris data from the lens going in real time into your system in the Unreal Engine, where you’re creating backgrounds, so you automatically know exactly what the depth of field is, instead of best-guessing and then letting the DP decide what it should be.
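Once focus distance and iris arrive as live metadata, the depth-of-field calculation is straightforward optics. The sketch below uses the standard hyperfocal-distance formulas; the example lens values (35mm focal length, T2.8, 0.025mm circle of confusion) are assumptions for illustration, not values from any real production.

```python
# Minimal sketch of the depth-of-field math a virtual production system can
# run once it receives live focus and iris metadata from the lens.
# All numeric inputs below are illustrative assumptions.

def depth_of_field(focal_mm, f_number, focus_mm, coc_mm=0.025):
    """Return (near, far) limits of acceptable focus, in millimetres."""
    # Hyperfocal distance for this focal length / stop / circle of confusion.
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = focus_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_mm - 2 * focal_mm)
    if hyperfocal - focus_mm <= 0:
        far = float("inf")  # focused beyond hyperfocal: sharp to infinity
    else:
        far = focus_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_mm)
    return near, far

# e.g. a 35mm lens at T2.8 focused at 3 metres:
near, far = depth_of_field(35, 2.8, 3000)
print(round(near), round(far))  # roughly 2565mm to 3612mm
```

With these two numbers per frame, the engine can defocus the virtual background to match the live camera instead of best-guessing.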

Some of the data sent by the lens

This is why lens metadata is important: it’s not just for the backend. Believe me when I say that, on a $300 million picture, you could save millions of dollars by having more data in visual effects; it’s just how it works. But even in virtual production, it’s now necessary to have even better data, better communication between camera lenses and the backend computer systems, to get a more accurate image up front, because that’s what it’s all about. In virtual production, it’s about skipping the backend.

We could go on, but maybe you’d like to just read the articles (produced by SONY CINEMA) yourself


Read the full article HERE


Read the full article HERE

Unfortunately, these interviews were given before Snehal hooked up with The Factory for the various MARK ZERO tests – had he given the interview after that, he’d be gushing about SYNCRO-LINK (like everyone else) – but you don’t need to trust us – call him / write him – he’ll tell you what he thinks of the product – and he’ll tell you in real time!

With virtual productions using live action cameras for AR, Simulcam, VR and XR, he understands the need for perfect sync between the virtual camera and the live action camera. Accurate metadata from the camera system needs to be streamed to the virtual system with minimal delay.

With many modern lenses now supporting direct metadata connections, it is easier than ever to stream high-accuracy metadata in perfect sync with the image capture, without relying on the camera body or camera tracking system to transport the lens metadata.
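To make "streaming per-frame lens metadata" concrete, here is a minimal sketch of one packet per shutter period sent over UDP. The packet layout, field names, and port are entirely invented for illustration; they do not describe SYNCRO-LINK's actual protocol or any real lens interface.

```python
# Hypothetical sketch: one fixed-size lens-metadata packet per camera frame.
# Layout (invented for illustration): frame counter + three 32-bit floats.
import socket
import struct

PACKET_FMT = "<Ifff"  # little-endian: uint32 frame, focus, iris, focal

def pack_lens_frame(frame_count, focus_mm, iris_tstop, focal_mm):
    """Pack one frame's lens metadata into a 16-byte UDP payload."""
    return struct.pack(PACKET_FMT, frame_count, focus_mm, iris_tstop, focal_mm)

def unpack_lens_frame(payload):
    """Decode a payload back into a metadata record on the receiving side."""
    frame_count, focus, iris, focal = struct.unpack(PACKET_FMT, payload)
    return {"frame": frame_count, "focus_mm": focus,
            "iris_tstop": iris, "focal_mm": focal}

# Sender side: in practice this would fire once per genlocked shutter period.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = pack_lens_frame(1042, 3000.0, 2.8, 35.0)
sock.sendto(packet, ("127.0.0.1", 9000))  # hypothetical receiver address
```

Tagging each packet with the frame counter (driven by the same genlock as the camera) is what lets the receiving engine match metadata to the exact frame it was captured with.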

Using the same video genlock provided to the camera, the SYNCRO-LINK MARK-ZERO, synchronized to the camera shutter, gives you ZERO-DELAY, SHUTTER-SYNCED LIVE STREAMING OF METADATA OVER ETHERNET … and that’s what Snehal is talking about!

Enjoy the articles.

Excuse us, but – that was our PLUG for SYNCRO-LINK!