In March 2018, I shot and edited a stereoscopic 3D-180 mini-documentary about Bob Kramer, one of the most well-known kitchen knife makers in the world. This is a case study about the project, which was conceived to develop a workflow that enables a single person to produce high-quality stereoscopic video content for VR headsets. Also, Bob’s work is gorgeous, and it was a good opportunity to create an experience that shows what he does!
In February, I accompanied sloth scientist Bryson Voirin to Panama. Armed with a stereoscopic 180 camera and an ambisonic audio recorder, my goal was to capture footage of sloths in the most immersive way possible. The result is a short video that includes footage nearly 100' up a fig tree and makes you feel like you're exploring the rainforest with Dr. Voirin, in search of sloths.
I've really been enjoying using my Surface Book 2, but its Adobe Creative Cloud performance has been terrible. In Adobe Premiere Pro and Adobe Media Encoder, the hardware acceleration (GPU) options were all disabled, which meant that tasks like video encoding were running entirely on the CPU. My 15" Surface Book 2 has two GPUs: an integrated Intel UHD Graphics 620 adapter and an NVIDIA GeForce GTX 1060. It seemed to me that Adobe products were defaulting to the Intel driver, which can't be selected for hardware acceleration.
I recently switched to the Microsoft Surface Book 2, and I really like it. I got the Surface Dock and have it working with 2 large external displays, wired Ethernet, and a keyboard and mouse. Is this too good to be true??
My talk about 360 video at Oculus Connect 4 is now online! It's a high-level summary about the current state of 360 video including equipment and workflow, followed by a chat with Paul Raphaël and Ryan Horrigan of Félix & Paul Studios about the making of MIYUBI.
I did some experiments yesterday about how one might do picture-in-picture in VR180 (stereoscopic 180). The video is made for VR headsets, so I'm only providing a link to download the video instead of uploading it as a stereo 360 video.
The monoscopic bullet time video created by the Insta360 ONE camera can easily be converted to a 3D video by showing the same footage separately to each eye with a time offset of one frame. Because the camera is moving (mostly) horizontally, the time delay creates virtual left- and right-eye points of view.
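The idea can be sketched in a few lines: pair each frame with the frame one step later and present the pair as left/right views. This is a minimal illustration assuming frames are NumPy arrays; the function name is hypothetical, and which eye gets the delayed frame depends on the direction of camera motion.

```python
import numpy as np

def stereo_from_motion(frames, offset=1):
    """Build side-by-side stereo frames from a monoscopic clip.

    Pairs frame i (left eye) with frame i + offset (right eye).
    With mostly horizontal camera motion, the time delay approximates
    a second eye's viewpoint. Swap the halves if the camera moves the
    other way. Hypothetical helper, not Insta360's actual pipeline.
    """
    pairs = []
    for i in range(len(frames) - offset):
        left = frames[i]
        right = frames[i + offset]
        # Horizontally stack into a left|right side-by-side frame.
        pairs.append(np.hstack([left, right]))
    return pairs

# Synthetic 4-frame "clip": each 2x3 grayscale frame is filled with its index.
frames = [np.full((2, 3), i, dtype=np.uint8) for i in range(4)]
sbs = stereo_from_motion(frames, offset=1)
```

With a 1-frame offset, a 4-frame clip yields 3 stereo frames; the last frame has no later partner and is dropped.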
I posted a video of my son the other day, and I was able to both shoot and appear in it. I shot the video using an Insta360 ONE 360 camera on a light stand and "re-shot" it as traditional video using Insta360's "FreeCapture" feature in the iOS app. All I did was frame the video in real time while watching it in FreeCapture mode and export it as a traditional video. It worked pretty well, but the export took many minutes, which was hard to complete on a smartphone (if the phone goes to sleep, the camera turns off and the export is canceled).