Immersive Video Production
Will it be the future of media?
I might be a little late to the party, but I was excited to learn that Apple has teamed up with Blackmagic Design again, this time for the world’s first commercial camera system and editing software for Apple Immersive Video on Apple Vision Pro. The two companies previously collaborated on the Blackmagic eGPU years ago.
For those who might not know, Apple Immersive Video is, in Apple’s words, a cutting-edge storytelling medium that uses 8K 3D video with a 180-degree field of view and Spatial Audio to place viewers right inside the scene. Apple tends to act as if it were first to market, even though the industry has been producing 3D films for VR headsets and using surround sound for ages.
Despite my little rant, I’m genuinely excited. A specialized camera with a dual custom lens system designed for Apple Immersive Video could be a game-changer for content creators everywhere. The camera captures 8K stereoscopic 3D at 8160 x 7200 per eye, with pixel-level synchronization and an impressive 16 stops of dynamic range. It’s not just powerful but also lightweight, with a durable body and industry-standard connections. Featuring Blackmagic’s Generation 5 Color Science and a new film curve, it records both lenses at 90fps into a single Blackmagic RAW file. It also includes the new 8TB Blackmagic Media Module for recording, high-speed Wi-Fi, 10G Ethernet, and mobile data connections, and it comes with a studio license for DaVinci Resolve to handle post-production.
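To put those specs in perspective, here is a rough back-of-the-envelope data-rate estimate. The bit depth and Blackmagic RAW compression ratio are my assumptions, not published figures, so treat the results as order-of-magnitude numbers only.

```python
# Rough data-rate estimate from the published capture specs.
# ASSUMPTIONS (not published by Apple or Blackmagic): 12-bit sensor data
# and an 8:1 Blackmagic RAW compression ratio; BRAW offers a range of
# constant-bitrate and constant-quality settings, so real figures will vary.

WIDTH, HEIGHT = 8160, 7200   # published resolution per eye
EYES = 2                     # stereoscopic capture
FPS = 90                     # published frame rate
BIT_DEPTH = 12               # assumed sensor bit depth
COMPRESSION = 8              # assumed BRAW ratio (8:1)
MODULE_TB = 8                # published Media Module capacity

raw_bits_per_sec = WIDTH * HEIGHT * EYES * FPS * BIT_DEPTH
compressed_gbytes_per_sec = raw_bits_per_sec / COMPRESSION / 8 / 1e9
minutes_on_module = MODULE_TB * 1000 / compressed_gbytes_per_sec / 60

print(f"Uncompressed: ~{raw_bits_per_sec / 1e9:.0f} Gb/s")
print(f"At 8:1 BRAW:  ~{compressed_gbytes_per_sec:.1f} GB/s")
print(f"8TB module:   ~{minutes_on_module:.0f} minutes of recording")
```

Under those assumptions that is roughly 127 Gb/s uncompressed, about 2 GB/s after compression, and a little over an hour of recording per 8TB module. Whatever the real ratios turn out to be, this is not a signal you casually pass around a production.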
These days I don’t do a lot in the world of events, but one of my first thoughts was: could this be used for live production? Imagine putting on your VR headset and experiencing an event as if you were there, not as the static observer of most VR event experiences, but watching a full-on multicamera production of the event, just as you would on your TV.
There are a few challenges to address, the first being viewers without VR headsets. Ideally, the production would carry multiple feeds per camera: two for VR (one for each eye), plus either a separate single-lens view or a duplicate of one eye, taken at the switcher, for non-VR viewers.
The issue of switching is also significant. Currently, I’m not aware of any hardware switcher capable of mixing multiple VR camera feeds in real time for live broadcast, nor one that outputs both 2D and 3D feeds, although the 2D feed could simply use the output of one lens. For now, in-camera recording seems unavoidable, since there appears to be no external device capable of recording both eye feeds into a single file for later post-production.
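Just to make the idea concrete, here is a minimal sketch of how such a stereo-aware switcher might be modeled. Everything in it is hypothetical (the class names, the feed identifiers, the whole API); as far as I know, no such hardware exists today. The key constraint it illustrates is that a cut must apply to both eyes on the same frame, while the 2D program can simply mirror one eye.

```python
from dataclasses import dataclass

@dataclass
class StereoSource:
    """A camera contributing a pixel-synchronized left/right eye pair."""
    name: str
    left_feed: str    # hypothetical input identifier, e.g. an SDI port
    right_feed: str

class StereoSwitcher:
    """Hypothetical switcher that cuts both eyes together and derives 2D."""

    def __init__(self, sources: list[StereoSource]):
        self.sources = {s.name: s for s in sources}
        self.program: StereoSource | None = None

    def cut(self, name: str) -> None:
        # Both eyes must switch on the same frame boundary; a mismatched
        # cut would show each eye a different camera, which is unwatchable.
        self.program = self.sources[name]

    def outputs(self) -> dict[str, str]:
        assert self.program is not None, "no program source selected"
        return {
            "vr_left": self.program.left_feed,
            "vr_right": self.program.right_feed,
            "program_2d": self.program.left_feed,  # 2D = one eye, as noted above
        }

rig = StereoSwitcher([
    StereoSource("cam1_stage", "sdi://in1-left", "sdi://in1-right"),
    StereoSource("cam2_crowd", "sdi://in2-left", "sdi://in2-right"),
])
rig.cut("cam1_stage")
print(rig.outputs())
```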
It’s possible that 12G-SDI feeds could carry side-by-side left and right images, which would then be mixed as usual, but this would result in a 3D-only stream. Since the camera has not yet been released to the public and pricing has not been announced, waiting for reviews seems to be the only way to discover what solutions might exist.
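Back to the side-by-side idea: a quick sanity check, assuming 10-bit 4:2:2 sampling (the usual SDI signal format, not a published spec for this camera), suggests bandwidth would be the bigger obstacle.

```python
# Could a single 12G-SDI link carry a full-resolution side-by-side stereo frame?
# ASSUMPTION: 10-bit 4:2:2 sampling, i.e. 20 bits per pixel on average.

SDI_12G_GBPS = 11.88          # nominal 12G-SDI payload rate
WIDTH, HEIGHT, FPS = 8160, 7200, 90
BITS_PER_PIXEL = 20           # 10-bit 4:2:2

side_by_side_gbps = (WIDTH * 2) * HEIGHT * FPS * BITS_PER_PIXEL / 1e9
print(f"Full-res side-by-side: ~{side_by_side_gbps:.0f} Gb/s "
      f"vs {SDI_12G_GBPS} Gb/s on 12G-SDI")
# ~212 Gb/s: roughly 18x what one 12G-SDI link carries, so side-by-side
# would mean heavy downscaling, a large bundle of links, or IP video instead.
```

In other words, side-by-side over a single 12G-SDI link would only work at a fraction of the capture resolution; carrying full quality live would point toward something like SMPTE ST 2110 over 100G Ethernet rather than SDI.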
If Apple and Blackmagic ignore the potential of live events with their current or future cameras, it would be a significant missed opportunity. Apple could set Apple TV+ apart from every other streaming service by offering exclusive live VR content.
The prospects for live events and VR are thrilling, with the potential to evolve from simple multicamera streams into immersive experiences enhanced by Unreal Engine, complete with interactive 3D effects, holograms, and so much more.
Am I dreaming too big? Would you like to see companies like Apple or Blackmagic explore this path, or have I missed the mark? Share your thoughts in the comments or reply to the social media post that brought you here.