Whale shark feeding in 3D

Here is a 3D video of a whale shark feeding at the surface during a huge whale shark aggregation in Isla Mujeres, Mexico. I shot it on August 15, 2011, using a [GoPro 3D HERO System](http://gopro.com/3D) and an [Eye of Mine 3D flat lens housing](http://www.eyeofmine.com/gopro/order-gopro-hero.html#euwl3d) (a flat-lens solution is required for a GoPro to focus properly underwater). The video is best viewed at 720p in some sort of 3D mode.

If you own a 3D display at home, you can [download a higher-quality side-by-side version](https://www.yousendit.com/download/ZUd2K0d1ZDVBNkUwTVE9PQ) for local display (~99MB; link is good for 500 downloads; if it fails, please [let me know](/contact)). The downloadable video is still highly compressed and doesn't quite convey the same 3D coolness that the original version does, but it is still effective!

Wetpixel Isla Mujeres Expedition 2011, Days 1-3

Day 3 of the 3rd Wetpixel whale shark expedition in Isla Mujeres, Mexico: We've had 3 days of whale shark action so far, and each day has given us something different (but spectacular). The first day, a couple hundred whale sharks were spread out in a rather long stretch of the glassy-calm ocean. The water was relatively clear, considering that it was completely full of transparent tunny eggs from the mass-spawning event three nights earlier. Whale sharks gulped down eggs around us from 8am until our boat left (at 1:30pm). On the second day, we discovered a small patch of ocean with hundreds of tightly-packed whale sharks. They were so dense that they were forced to feed in layers, and we saw as many sharks ascending and descending as we did on the surface of the ocean (very rare). Our guides were totally excited, saying that the ocean was *infestado* (infested) with whale sharks. After thirty minutes of total whale-shark insanity, the sharks vanished in a coordinated descent into the depths -- it was totally bizarre. One minute, we were surrounded by literally hundreds of sharks, and the next, there were only a few left on the surface. All of us, including the local guides, were totally dumbfounded by the strange behavior.

Today (day 3), we found the sharks 4 miles east and 2 miles south of where they were yesterday. It took a coordinated search effort by multiple boats to find them (which took 3.5 hours on the water), and we weren't in the water until 9:45am. The action was fantastic, with *botellas* almost literally everywhere we looked (a *botella* is a stationary whale shark that hangs vertically in the water, "gulping" water constantly to feed).

I've been shooting with both a Nauticam-housed Canon 7D with a Tokina 10-17mm fisheye zoom lens and a 3D GoPro HERO setup (in an Eye of Mine 3D underwater GoPro housing). The 3D GoPro setup has been yielding some very interesting footage because I can get the camera into places where a big housing can never go (e.g., right in front of a whale shark that is cruising at speed). I have some interesting 3D footage that I'd love to present, but two failed upload attempts to YouTube are enough; I'll upload when I return to the States.

In the meantime, here's a 3D screen-grab from the video (red/cyan 3D glasses required):

3D whale shark gulp with GoPro 3D HERO camera / Eye of Mine 3D underwater housing

I also have cute / precious footage of Kieran Liu (the 5-year-old son of my friends Kenny and Lori) swimming madly after a whale shark (and managing to get really, really close). He is fearless!

**Update:** here are links to the videos:

- [Kieran Liu swims with a whale shark](/journal/2011/08/20/kieran-liu-age-5-swims-with-a-whale-shark/)
- [3D whale shark feeding video](/journal/2011/08/20/whale-shark-feeding-in-3d/)

GoPro 3D HERO Review

GoPro 3D Hero housing with two GoPro HD HERO cameras installed

A few days ago, GoPro announced the [GoPro 3D HERO System](http://gopro.com/hd-hero-accessories/3d-hero-system/), the world’s smallest 1080p 3D camera. The 3D HERO System consists of a housing that accepts and aligns two GoPro HD HERO cameras for 3D capture, and GoPro CineForm Studio software, which is available as a free download on GoPro’s website.

GoPro sent me a 3D HERO to test, and it arrived today. [Full review over at Noodletron](http://noodletron.com/2011/460/).

More Quicktime X evilness: it overscans (which affects side-by-side 3D)

I hooked my MacBook Pro up to a Panasonic 58" 3D plasma display today and fed it 3D video in side-by-side format. The good news is that the display showed a perfectly good 3D image when told to expect side-by-side content (yay!). However, there was a problem with my stereo 3D video: for some reason, all of the 3D content was pushed way back past the stereo window. The only way this could happen is if the left and right video content had been diverged; upon closer inspection, it was obvious that this was the case. At first, I thought I might have screwed up my video export, so I exported it again. The problem remained. On a whim, I fired up Quicktime 7 to play the side-by-side video, and the problem went away! I ran some more tests, and it turns out that by default, Quicktime X overscans! That's right -- it cuts off video content all around the edges.


Quicktime output of side-by-side content

Zoomed: overscan is obvious


Inner red rectangle is Quicktime X's cropped output

Normally, one might not notice a small percentage of video content disappearing (in fact, overscan is standard practice in TV), but in side-by-side stereo 3D content, it has the effect of diverging the image, pushing it back through the stereo window. The left edge of the left eye's video (on the left side of the screen) is cut off, the right edge of the right eye's video is cut off, and the remaining content is scaled back up to full size, introducing a horizontal divergence.
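
The geometry is easy to quantify. Here's a sketch of the induced divergence, assuming Quicktime X crops a fixed number of pixels from each edge and rescales the remainder (the exact behavior isn't documented; the 16 px/side figure below just matches the 1888-pixel "Actual" width reported for a 1920-pixel video):

```python
def overscan_pos(x, full_w, crop):
    """Horizontal position of full-frame pixel x after `crop` px are cut
    from each side and the remainder is scaled back up to full_w."""
    return (x - crop) * full_w / (full_w - 2 * crop)

def induced_divergence(eye_w, crop):
    """Extra horizontal disparity (px) that symmetric overscan adds to
    side-by-side content, for a feature originally at zero parallax."""
    full_w = 2 * eye_w
    x = eye_w // 2                 # same position in both eye images
    left_eye = overscan_pos(x, full_w, crop)
    right_eye = overscan_pos(eye_w + x, full_w, crop) - eye_w
    return right_eye - left_eye

# 1920-wide side-by-side frame, 16 px cropped from each side:
print(round(induced_divergence(960, 16), 2))  # 16.27 px of divergence
```

Sixteen-odd pixels of uniform divergence is easily enough to push everything back through the stereo window on a large display.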

Add Quicktime X's propensity to [screw up anaglyph 3D video colors](/journal/2010/09/22/quicktime-x-and-colorsync-screw-up-anaglyph-3d/), and stereographers should just stay away from Quicktime X altogether. I would simply delete it from my system, but when I do that, double-clicking on a Quicktime movie opens Final Cut Pro instead of Quicktime 7. There is no apparent way to change the default application for file types in Mac OS X[^1], and opening Final Cut Pro is not good default behavior.

In hindsight, Quicktime X's overscan was obvious all along. When I open a 1920x1080 video and hit cmd-i for video info, I see this:

I always wondered what the "1888 x 1062 (Actual)" current size meant. Now I know that Quicktime X is throwing away 32 horizontal pixels and 18 vertical pixels (1.67% of each axis). If you change the display size of the video, the current size changes, but it never shows more than 1888 x 1062 of the actual content -- it just scales up the existing crop.
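
For the record, the reported "Actual" size works out to the same fraction on both axes:

```python
# Quicktime X reports 1888 x 1062 (Actual) for a 1920 x 1080 video.
lost_w = 1920 - 1888   # 32 horizontal pixels discarded
lost_h = 1080 - 1062   # 18 vertical pixels discarded
print(round(100 * lost_w / 1920, 2))  # 1.67 (% of width)
print(round(100 * lost_h / 1080, 2))  # 1.67 (% of height)
```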

Once again, Quicktime X proves itself as a huge waste of time for professional users.

[^1]: Changing the default program only does so for the specific file you are modifying.

Ordering 3D prints from FinePix REAL 3D W3 and W1

Fuji's [product page](http://www.fujifilm.com/products/3d/camera/finepix_real3dw1/features/index.html) for the FinePix REAL 3D W1 lists a "High-resolution 3D Print System" which W1 owners can theoretically take advantage of for ordering 3D lenticular prints. I tried to order a couple 3D prints today, and I'd bet money that Fuji barely ever sells any 3D prints. I loaded up Fuji's MyFinePix Studio, the (terrible) piece of software that comes with the Fuji W3, but when I tried to order a print using a 3D MPO file as a source, it told me that I couldn't print 3D images. So I clicked on the icon for online print orders, which took me to [See Here](http://www.seehere.com/), which is presumably Fuji's print provider. At See Here, you can upload MPO files and add them to your cart. However, by default, you're ordering standard 2D prints. This is not obvious. At the bottom of the Order Prints page is a little link that says, "Click here if you would like to order a 3D print."

This link takes you to a [3D Prints](http://www.seehere.com/proProductDetails.do?productId=15570&imgEditImageSubSID=4) page.


Start here for 3D prints (and come back here for each new print you want to add)

Click on "Create Now," and you're taken to an options page with no options other than to "Continue."


A truly useful page in the ordering process

Upload some images in Composer view, add a photo, and then add to cart.


The Composer. Lets you add exactly one image to your order.

You're now stuck in the cart with no option to add another image. If you go back using the back button and add another image to the cart, it simply replaces the item that is already in your cart. To add a second 3D image for printing, you have to manually go back to the [3D Prints](http://www.seehere.com/proProductDetails.do?productId=15570&imgEditImageSubSID=4) page and go through the process again. I recommend keeping the URL for the 3D prints page in your clipboard. After you add each image to your cart, you can simply paste the URL back into your browser's location field and hit enter.


Multiple 3D prints in the cart! A miracle!

I'm sure none of this really matters. At $6.99 per 5x7" 3D lenticular print (-25% for the October sale), Fuji isn't selling very many of these things. Still, it's much cheaper than [other lenticular print providers](http://www.snap3d.com/2-image-lenticulars/s_price_list.html), and there aren't very many options out there...

Peter's empty lot barbecue

I processed one of the [Fuji FinePix Real 3D W3](/journal/2010/09/08/fuji-finepix-real-3d-w3-mpo-and-3d-avi-files-on-mac-os-x/) images today using [StereoPhoto Maker](http://stereo.jpn.org/eng/stphmkr/) (running in Parallels).


*3D image of Peter with flames at his ground-breaking celebration (anaglyph red/cyan glasses required). You can also check out the [universal L-R-L version](/journal/images/misc/echeng101002_0265935_lrl.jpg), which looks much better (if you know how to view them)*

The Fuji gives a good 3D effect that looks great on its rear lenticular 3D LCD, but when viewed at a normal size, image quality is horrible. The image above was taken at ISO 200, and it is still really crunchy (and has low dynamic range). All I can guess is that it was too expensive to put two high-quality lenses, sensors, and processors into one camera body.

Still, it's fun.

Quicktime X and ColorSync screw up anaglyph 3D

I just returned from a trip to the Bahamas, where I shot [and posted](http://echeng.com/journal/tag/bahamas-2010/) a bunch of 3D underwater video encoded for viewing with red/cyan anaglyph 3D glasses. When I arrived at home, I opened one of the 3D videos on my calibrated 30" Dell 3008WFP LCD (connected to a Mac Pro)... and discovered that I could not see any 3D effect. It was quite strange because I could see the 3D effect perfectly when I streamed the same videos from my [Vimeo page](http://vimeo.com/echeng/videos) (on the same machine/display), and everything looks good when played from my MacBook Pro (even when it is attached to an external monitor). I even tried playing the videos back on my 50" plasma display, and the 3D was fine. What I discovered is that if you have a properly calibrated video card / monitor, you may not be able to see anaglyph 3D-encoded images and video correctly. When trying to render colors "correctly," ColorSync can change the colors enough to destroy the effect.

Here's a screenshot of a 3D underwater shark video played back in Quicktime X:

Quicktime X or Quicktime 7 (default) -- must view this image / page in color-space aware browser (Safari, Chrome or [modified Firefox](http://www.smmug.org/index.php?q=node/522))

(note that all images on this page will look the same if you are using IE or Firefox because they are not color space aware)

On all of the displays in my house, the 3D effect is NOT there (obviously, you need to be viewing the image while wearing anaglyph red/cyan 3D glasses).

Now, here's a screenshot from Vimeo:


Screenshot played back on the web from Vimeo

I opened Quicktime 7's preferences and checked the "Enable Final Cut Studio color compatibility" checkbox, which states:

> *"When enabled, video is not displayed using ColorSync. Source colors are read with 2.2 gamma and are displayed in a color space with 1.8 gamma."*

Here's the resulting video (screengrab):


Screenshot from Quicktime 7 with FCS color compatibility checked

Unfortunately, Quicktime X *has no preferences whatsoever*. I hated Quicktime X even before I started to do 3D experimentation, and this isn't helping its case.

Maybe there is a way to delete color space information from my Quicktime videos. I need to do some research.

Note: As part of my standard workflow, I use GammaSlamma to remove color profiles from .png files before I upload them to the web (the web is generally not color profile friendly). When I took the Quicktime X screenshot and removed color profile information from the .png, the 3D effect came back! I had to leave color profile information intact in that particular screenshot in order to demonstrate the lack of 3D effect.

Fuji FinePix REAL 3D W3: MPO and 3D-AVI files on Mac OS X

I recently acquired a [Fuji FinePix REAL 3D W3](http://www.fujifilm.com/products/3d/camera/finepix_real3dw3/) point & shoot camera, which is the only (proper) 3D point & shoot camera on the market ([Mark Blum](http://www.undersea3d.com) showed me the W1, its predecessor, some time ago). The camera stores still images in MPO format, which is essentially two JPGs, thumbnails and metadata crammed into a single file. It stores video files in a stacked AVI format called 3D-AVI. New file formats are always challenging to deal with, especially if you're on a Mac. Fuji ships the camera with its FinePix software, and after installing it, I realized that it has no 3D support because the Mac version is two major revisions behind the Windows version!
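
Because the container is literally JPEGs laid end to end, you can split one yourself in a pinch. Here's a naive Python sketch (my own illustration, not one of the tools below; a robust splitter would parse the MP Index in the first image's APP2/MPF segment rather than pattern-matching for markers):

```python
def split_mpo(data: bytes):
    """Split an MPO container into its constituent JPEGs by scanning for
    the JPEG SOI marker followed by an APP1 segment (FF D8 FF E1), which
    starts each image in typical two-image files from the W1/W3."""
    marker = b"\xff\xd8\xff\xe1"
    starts = []
    i = data.find(marker)
    while i != -1:
        starts.append(i)
        i = data.find(marker, i + 1)
    return [data[s:e] for s, e in zip(starts, starts[1:] + [len(data)])]

# usage sketch (hypothetical filename):
# left, right = split_mpo(open("DSCF0001.MPO", "rb").read())
```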

I went online looking around for MPO and 3D-AVI support for the Mac. Things aren't looking so good.

Every single program for Mac OS X that supports MPO or 3D-AVI files looks like it was written in 1995 by a Windows programmer: they are celebrations of Mac standards violations and unintuitive interfaces! But, hey -- they open the files, and some of them are even free.

**Here are the most functional of the group:**

- [Mac 3D Viewer 1.0](http://mac3dviewer.apple-solutions.net/en/): No anaglyph export, but effective MPO viewer. No folder navigator or drag & drop -- must use unwieldy file open dialog. Slow folder analysis step.

- [RedGreen](http://mac.clauss-net.de/redgreen/): Simple! You open left and right images (easily, by multi-select in the open file dialog -- no drag & drop) and merge them into an anaglyph. Moving images around to adjust is painfully slow, and there are no fit-to-window or zoom keyboard shortcuts, requiring the use of a mouse to select zoom level.

- [StereoSplicer (Beta 7)](http://web.me.com/ijunji/Challenge!_REAL_3D_and_Mac_E/Application.html): Extracts left and right plus anaglyph and side by side (parallel or cross view) and lets you save them. Files must be named DSCF0000.MPO (where 0000 can be any number), which breaks most workflows. Also will separate 3D-AVI video files.

- [ExifTool combined with Automator](http://3dphotography.wordpress.com/2009/11/23/extracting-the-two-images-from-fujifilm-real-3ds-mpo-format-on-a-mac/): This is my favorite way of extracting left and right images from an MPO. Using an Automator workflow, I can right-click on a file (or series of files) and select "Extract MPO Files." The script specified in the page I linked above doesn't support a folder structure or files that contain spaces. Here is an Automator service and application that fixes the problem (I am not responsible for these -- use at your own risk!).

- [StereoPhoto Maker](http://stereo.jpn.org/eng/stphmkr/) / [StereoMovie Maker](http://stereo.jpn.org/eng/stvmkr/) (for Windows): by far the best software out there in terms of functionality, with full 3D-AVI support, too. I tried running it in [WineBottler](http://winebottler.kronenberg.org/) -- not recommended! It was extremely slow on my MacBook Pro, taking 20 seconds to open a single MPO from the Fuji. However, it is very responsive in Parallels / Windows XP.

- Fuji FinePix v5 (Windows) in VMWare Fusion / Parallels

**Here are some other options (don't recommend):**

- [AnaBuilder](http://anabuilder.free.fr/indexEN.html)
- [Stereomerger](http://www.stereomerger.com/mw/index.php/Welcome_to_Stereomerger)

### Notes about 3D video:

For video, once you have extracted left and right streams, [Dashwood Stereo3D Toolbox](http://www.dashwoodcinemasolutions.com/stereo3dtoolbox.php) works well and is designed so editors who work in Final Cut Studio can add stereoscopic video to their workflow.

I've been experimenting with line by line interleaved (interlaced) 3D video lately, and discovered a few things:

- Quicktime X playback of 1920x1080 line by line interlaced content is wrong (it didn't work on my Hyundai W240S monitor). Use Quicktime 7 or another player.
- Dashwood Stereo3D's line by line interlaced output doesn't work with interlaced video sources.
- The Hyundai W240S, which is a 24" monitor that supports line by line interlaced 3D using Xpol and circular polarized glasses, displays 3D that is less pronounced than what you get using anaglyph glasses. I'm returning it.

**UPDATE**: The folks over at AlienRat Design have created [a QuickLook plugin for the Fuji FinePix REAL 3D W3](http://alienrat.net/software/mpo.quicklook.html) so you can preview MPO files in Finder.

**UPDATE, OCTOBER 4, 2010**: I've been using [Fuji's MyFinePix Studio](http://www.fujifilm.com/support/digital_cameras/software/myfinepix_studio/index.html) in Parallels / Windows XP, and the thing is pretty terrible. You can't look at a 3D image using the software (anaglyph or any other view). Today, Fuji released v2.0 of MyFinePix Studio for Windows 7 / Vista / XP, which includes the ability to edit 3D movies, divide 3D movies into two 2D movies, and divide MPO files into two JPG images. All of these are rudimentary features that have long been supported by StereoPhoto Maker and StereoMovie Maker.

Moray eels hunting at night (3D anaglyph)


*red/cyan glasses required*

3D video (anaglyph red/cyan) of a moray eel hunting at night in the Maldives. Shot underwater with a [custom BS Kinetics underwater housing](/journal/2010/07/21/underwater-3d-stereoscopic-video-housing-unboxing-setup/) for dual Sony CX550V camcorders.

If you would rather see a side-by-side format or row interleaved, check out the [YouTube version of the video](http://www.youtube.com/watch?v=p5ePPMokVFQ).

If you don't have anaglyph 3D glasses and want to see the footage, click through for the left eye view.

Whale shark gulp in 3D slow motion, Isla Mujeres, Mexico

We had in excess of 500 whale sharks (*Rhincodon typus*) at Isla Mujeres today in perfectly clear skies and mirror-flat water. I am speechless... but not speechless enough to try to upload some video (the internet is fast enough tonight for me to get a few videos online).

Click through for 3D version of the video, plus a bonus video of Heidi Connal swimming with a "botella" (bottle) -- a whale shark that is vertical in the water gulping water without moving (well, it rotates, but it stays in the same place).

Heidi with a whale shark:

It is Alexis, Nathalie and Heidi's first time in the water with whale sharks. Do you think they might be a little spoiled? ;)

First underwater 3D stereoscopic video: cenotes diver

For my first underwater 3D shoot, I dove cenote Chac-Mool, which is about 20 km from Playa del Carmen (just outside of Cancun). I had never taken my [modified BS Kinetics 3D housing](/journal/2010/07/21/underwater-3d-stereoscopic-video-housing-unboxing-setup/) underwater, so I first had to mount lights on it and do a pool test for buoyancy. Adding 4 lbs made the housing almost exactly neutral, although it is just a little bit back-heavy.

I HEART 3D

I was quite worried because a dark cave is not exactly the best place to use a camera system that relies on auto-exposure for its picture. I set the camera to underexpose 3 units (whatever that means in Sony land) and hoped for the best. The results were actually quite good!

Here is the first clip I processed, which shows Mario, our dive guide, swimming through a halocline (the boundary layer between fresh water and the denser salt water below it). Don't worry -- the fuzzy halocline water clears up after a few seconds.


Use red/cyan glasses to see this 3D video, and view it full screen for best results!

I'll do a more formal write-up about my 3D workflow when it is fully tested, but at the moment, it includes ClipWrap[^1], MPEG Streamclip, PluralEyes, Final Cut Pro, Dashwood Stereo3D Toolbox, and Compressor. I shot 107 clips (214 total, since each camera shoots separately), taking up a total of 16.6 GB of space (8.3 GB x 2).

[^1]: **Updated 17 Sep 2010:** Every once in a while, ClipWrap leaves the audio track out of re-wrapped AVCHD video from my Sony CX550V camcorder. There is an easy fix: open the Perian preference pane, click "Remove Perian," and then (immediately) click "Install Perian." It appears that Perian gets into a bad state and prevents audio from being transcoded properly. I am now in the habit of always checking my re-wrapped video for an audio channel. Once it fails once, it will fail on every successive re-wrap until Perian is removed and re-installed.

3D workflow is *extremely* time-consuming. It took 148 minutes and 37.668 GB of disk space to download all of my day's clips and process a single 55-second clip for upload to Vimeo:

| Workflow step | Time | +Data | -Data |
|---|---|---|---|
| Download 16.6 GB of AVCHD .mts files (8.3 GB x 2) to computer | 14 min | 16.6 GB | |
| Use ClipWrap to re-wrap 214 AVCHD .mts files as Quicktime .mov | 11 min | 19 GB | -16.6 GB |
| Preview clips / trash bad clips | 30 min | | -4.53 GB |
| Convert 2 x 358 MB clips (0:55) to ProRes for testing | 35 min | 1.34 GB | |
| PluralEyes clip sync | 5 min | 0.01 GB | |
| Final Cut Pro / Dashwood Stereo3D clip coupling, sequence | 5 min | | |
| FCP sequence export 1080i ProRes 422 LT | 37 min | 0.67 GB | |
| Compressor convert to 720p, de-interlace, watermark | 11 min | 0.048 GB | |
| **TOTAL** | **148 min** | **37.668 GB** | **-21.13 GB** |

After deleting all temporary data, I am left with 16.538 GB of new data for the day's work. The master clips (which I backed up onto an external hard disk) are now Quicktime-wrapped AVCHD H.264 video files. I will only convert to ProRes the subclips I choose for whatever little production I make from all of the clips I assemble. It is far too time-consuming and storage-intensive to convert all of my clips to ProRes in the field! I expect both time and data numbers to come down as I start targeting my shots, but since most of the time is actual computer download and rendering time, it probably won't go down that much.
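
As a sanity check, the bookkeeping above is internally consistent (numbers copied straight from the workflow breakdown):

```python
added   = [16.6, 19, 1.34, 0.01, 0.67, 0.048]  # +DATA column, GB
removed = [16.6, 4.53]                          # -DATA column, GB
minutes = [14, 11, 30, 35, 5, 5, 37, 11]        # TIME column

print(round(sum(added), 3))                 # 37.668 GB written
print(round(sum(removed), 2))               # 21.13 GB deleted
print(round(sum(added) - sum(removed), 3))  # 16.538 GB kept for the day
print(sum(minutes))                         # 148 minutes
```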

My first underwater 3D shoot was a great success -- I look forward to collecting more 3D footage and sharing it online... although it did take a couple of hours for me to process just a single clip. It may take me quite some time to do something more formal.

3D string quartet performance (anaglyph)

3D test (anaglyph -- requires red/cyan glasses) of Quartetto Sugoi in low light, using dual Sony CX550V camcorders zoomed in a bit, perhaps to 40mm (35mm equivalent) or so. Camera sync is approximately 16ms apart, which is why there is ghosting when objects move. The video targets large displays (30"+ ideal).

Due to tolerances in manufacturing, it is nearly impossible to get two of these cameras to align perfectly, resulting in the need for rotational geometry correction in post, which may also account for further image degradation (in addition to the low light noise, that is).

Music: Mendelssohn String Quartet No. 6, excerpt from first movement (thanks, Quartetto Sugoi!)

Underwater 3D stereoscopic video housing, unboxing / setup

Custom BS Kinetics carbon fiber underwater 3D housing with Sony camcorders

Custom BS Kinetics carbon fiber underwater 3D housing with Sony camcorders

A few days ago, I took delivery of a BS Kinetics DuoDive housing, which is designed to house 2 consumer camcorders for use in capturing underwater 3D video. The housing is a carbon fiber + epoxy oval (as opposed to being a rigid cylinder or machined aluminum housing, which is more typical), and features a flat port, red/orange filter, hinged port cap, and rear LCD that toggles between left and right camera display.

The housing is designed to be generic, which means that a variety of cameras can be mounted inside. I've decided to use two Sony CX550V AVCHD camcorders with Sony 0.75X wide-angle adapters (removable). Although the CX550V shoots AVCHD (bleh) at 1080i (double bleh), it has a wide lens (~29mm equivalent) and is easily controlled through LANC.

Speaking of LANC, the majority of the cameras' controls are driven by ste-fra LANC, which was developed for using two cameras to provide stereoscopic imaging. [Werner Bloos of digi-dat](http://www.digi-dat.de/produkte/index_eng.html) integrated the ste-fra electronics with the BS Kinetics DuoDive housing.

The rear numeric LED display shows sync information of the two cameras in milliseconds. Through this LANC control, I have access to power, zoom +/-, zoom speed, push autofocus, focus auto/manual toggle, focus +/-, record, mode switch (photo/video), guide frame on/off, and shoot photo. A yellow button switches the LCD view from left to right camera, and two additional buttons kick off infrared macros that blast the menu commands necessary for one-touch white balance and Smooth Slow Motion (a 240fps burst on the Sony CX550V).

Unfortunately, there is no control for exposure, so I will be forced to shoot in auto (the horror!). I may end up housing the Sony CX550V remote so I can blast infrared menu commands into the housing to control exposure. This would be extremely non-ergonomic, but would at least give me the option to lock down exposure. Also, since so much of the housing relies on custom electronics, I will likely install a vacuum valve so I can pressure test the housing before each dive. A flood would be catastrophic and non-user repairable in the field.

Using the rear control buttons, I find the housing and camera combination to be quite responsive. The buttons are easy to press, and there is no lag in camera control. Zoom and focus controls appear to synchronize well, although I'll have to analyze footage to determine how well the cameras actually perform together.

The infrared LED blasters took hours to position correctly. The manual states that "the two IR emitters need to be placed as close as possible to the IR receiver of the camcorders," and after hours of frustration, I realized that the emitters actually need to be placed *further* -- around 25mm away from the IR receiver -- in order to reliably control the camera. I suspect that this is due to a narrow beam angle from the IR-LED emitter. Because the area near the front of the mounting plate is so cramped, I had to place the LEDs in a diagonal position still facing the IR receiver on the camera, but not too close. I tried the configuration as pictured in the manual, but that didn't work at all. See the last photos in my gallery for how I ended up positioning the LEDs in order to get reliable infrared control.

In any case, I am all ready for my first series of 3D video dives next week (cenotes in the Yucatán followed by whale shark aggregation off of Isla Mujeres). Wish me luck!

3D stereoscopic video camera, with grip

There is no good way to hold two cameras attached to each other, so I had to come up with another way. Here are some photos of two Sony CX550V camcorders set up for 3D stereoscopic imaging, mounted on Really Right Stuff hardware and an Opteka X-Grip.


Two Sony CX550V camcorders set up for 3D stereoscopic imaging, mounted on Really Right Stuff hardware and an Opteka X-Grip


Two Sony CX550V camcorders set up for 3D stereoscopic imaging, mounted on Really Right Stuff hardware and an Opteka X-Grip


Two Sony CX550V camcorders set up for 3D stereoscopic imaging, mounted on Really Right Stuff hardware and an Opteka X-Grip

3D tests with dual Sony CX550V camcorders


Two Sony CX550V camcorders with wide-angle lenses attached
I was on my way to bed tonight when I decided to do some 3D experiments using my dual Sony CX550V camcorder setup (which I am putting into an underwater housing soon). The two camcorders are mounted on plates and rails from [Really Right Stuff](http://www.reallyrightstuff.com/), and both also have Sony 0.75x wide-angle adapters attached to them. The stereo base (inter-ocular distance) is 67mm.

I recorded a few seconds of video, transcoded the AVCHD into ProRes 422 LT, and imported the clips into Final Cut Pro. It was my first test with Dashwood Cinema Solutions' [Stereo3D Toolbox v2](http://www.dashwoodcinemasolutions.com/stereo3dtoolbox.php), and I relied heavily on [the tutorials](http://www.dashwoodcinemasolutions.com/tutorials.php#all) they've put online in order to put left and right video clips together into sequences that I could preview dynamically in side-by-side mode or by using anaglyph (red/cyan) glasses. For my first tests, I adjusted convergence using the Stereo3D Geometry video filter, which offers sliders for adjusting pretty much every geometry attribute in both left and right eyes.

In the images below, you can see screen grabs of video converged at different points in the frame: the music stand in the front, the front of the table, the fruit on the table, and the back wall. As you view images with convergence points further and further back, you'll notice some of the elements (like the front of the table) start to stick out in front of the screen.

Note that I didn't physically converge the cameras with toe-in convergence. Instead, I adjusted convergence in post production. This sort of convergence results in a loss of image in the right and left sides of the frame, but when I have my anaglyph 3D glasses on, I can't really tell.
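
Convergence-in-post is just horizontal image translation: slide one eye's frame sideways and crop both frames to the overlap, which is exactly why content is lost at the left and right edges. Here's a minimal sketch (a hypothetical helper of my own, not the Stereo3D Geometry filter; frames are lists of pixel rows):

```python
def converge(left, right, shift):
    """Adjust convergence by horizontal image translation (HIT).
    A positive shift increases disparity (pushes the scene back behind
    the screen); a negative shift pulls it forward. Each frame loses
    abs(shift) columns from one outer edge."""
    if shift >= 0:
        return ([row[shift:] for row in left],
                [row[:len(row) - shift] for row in right])
    return ([row[:shift] for row in left],
            [row[-shift:] for row in right])
```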


3D frame grab from video converged on the music stand


3D frame grab from video converged on the front of the table


3D frame grab from video converged on the butternut squash on the table


Side view of 3D test area (music stand, table, entrance to hallway)

You can [download the full-sized images](http://echeng.smugmug.com/3D/20100719-First-test-dual-Sony/12987544_U2gEJ#939728478_Go9tq) for a better look.

2010 NAB Show report

3D mania at NAB Show 2010
Last week, I decided that it would be interesting to check out the [NAB Show](http://www.nabshow.com/2010), an annual tradeshow in Las Vegas put on by the National Association of Broadcasters.

As a NAB Show newbie, I wasn't sure what to expect; I knew that it would be different from the dive shows I typically attend, but I wasn't prepared for just *how* different it ended up being. Everything was shiny, slick, and professional, and there were quite a few people wearing suits. I didn't see any flip flops or swimsuits, and no one was serving rum from beneath fake tiki huts. In the industry SCUBA diving shows I've attended, aisles are often devoid of people, and bored-looking booth workers sit around chatting idly or twiddling their thumbs. NAB was absolutely packed full of people. I could barely make my way through the aisles, and people were jammed right up against each other in the more popular booths.

I won't bother to talk about specific products here because there are no doubt a bunch of NAB show reports on other websites.

Although there were basically no products on display that were designed especially for underwater use, I combed the aisles looking for equipment that might be interesting for underwater videographers and filmmakers. I saw three themes at this year's NAB:

1. Digital SLRs do awesome video. Let's build stuff to support it.
2. SLR sensors are great. We'll put them into video cameras.
3. 3D is here, whether you're ready or not.

Like many people out there, I've been really excited by the video I've been getting from my Canon 5D Mark II and 7D SLRs. The show featured dozens of booths with accessories that make shooting video with an SLR bearable and semi-ergonomic. The first few feature-length movies shot with Canon 5D Mark II cameras are starting to appear, and the last episode of House was shot with one. But the video guys have decided that they've had enough, and they're starting to put big sensors into their cameras, which means that good video ergonomics will soon be meshed with big, clean sensors. I'm happy to have SLRs drive competition and push features into dedicated video cameras. We'll see features converge from both sides, and prices will plummet because you can already shoot beautiful 1080p HD video using a $700 Canon SLR.

2010 is, beyond a doubt, the year of 3D. There were probably hundreds of booths featuring 3D equipment: cameras, camera accessories, camera support, 3D displays, software, and more. I noted with humor that there was a small "3D pavilion" featured in the show guide. It was truly a joke because the entire show was really just one huge 3D pavilion.


People looking cool in their Panasonic 3D glasses

Shooting 3D on land seems pretty easy: you stick two cameras together and adjust inter-ocular distance (IO) and convergence. If the required IO is smaller than the diameter of your lenses, you switch to an orthogonal "beam-splitting" setup, with one camera shooting through a 50% mirror and the other shooting the reflected image. But post-processing is hard and mysterious. The high-end production folks seem to have it figured out, and there was quite a lot of activity at the large booths with presentations from the few special effects guys with experience in 3D. Prosumer and consumer 3D, however, seems like a big mess. Where are the standards? If I were to wake up tomorrow with 10 hours of fantastic 3D footage, how would I deliver it to a stock house? How would I edit it? How would I display it? What sort of signal would it take to drive a 3D display? How would I distribute it?
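To get a rough sense of the geometry: with a parallel (non-toed-in) rig, the on-sensor disparity of a subject falls off with distance, and convergence applied in post is just a constant horizontal shift subtracted from it. A simple pinhole-model sketch, with numbers and function names that are my own illustrative assumptions rather than figures from any real rig:

```python
def parallax_mm(io_mm, focal_mm, subject_mm, converge_mm):
    """Approximate horizontal parallax on the sensor for a parallel
    stereo rig, after a horizontal image translation that places the
    convergence plane at converge_mm. Pinhole model; illustrative only."""
    disparity = focal_mm * io_mm / subject_mm   # raw disparity at the subject
    hit       = focal_mm * io_mm / converge_mm  # shift applied in post
    return disparity - hit  # > 0: subject appears in front of the screen plane

# 65 mm inter-ocular, 20 mm lens, converged at 2 m:
for z in (1000, 2000, 4000):
    print(z, round(parallax_mm(65, 20, z, 2000), 3))
```

Subjects nearer than the convergence plane get positive parallax (in front of the screen), and farther subjects get negative parallax (behind it), which is why IO and convergence together determine what depth range is comfortable to watch.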

Who knows.

Many dual-camera setups that record separate video streams were on display. Some magic happens via various software or hardware products designed to merge the two streams into one, and the result seemed to be delivered to what looked like home-brewed 3D monitors via dual HDMI or HD-SDI connections. Even the home-brewed monitors were expensive -- thousands of dollars -- even though they were really just two LCDs placed orthogonally with a 50% mirror in the middle. The whole industry seems so eager to get into 3D that they are just hacking stuff together, pricing it high, and hoping that someone will buy it.
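One common way to merge two streams into one is half-width side-by-side packing, which squeezes both eyes into a single conventional frame. A toy sketch of the packing step; the frame sizes and the crude decimation method here are my own assumptions, not a description of any particular product:

```python
import numpy as np

def side_by_side(left, right):
    """Pack two full frames into one half-width-per-eye side-by-side
    frame, a common way to carry 3D inside an ordinary 2D stream."""
    half_l = left[:, ::2, :]   # crude 2:1 horizontal decimation
    half_r = right[:, ::2, :]
    return np.concatenate([half_l, half_r], axis=1)

left  = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(side_by_side(left, right).shape)  # (1080, 1920, 3)
```

The packed frame is the same size as the originals, which is exactly why side-by-side survives normal 2D compression and playback chains; the display unpacks it back into two eyes.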

3D has been proven to be a big money-maker in the movie industry, and the momentum of that industry will push it right into the home. It may take time for 3D televisions to become ubiquitous, as there seem to be many comfort and content issues to contend with. Will Joe Diver really come home from work, put on a pair of 3D glasses, and sit in front of the television? And if so, what 3D content will be available? It seems hard to believe that people will be walking around their homes with dorky, uncomfortable 3D glasses on their faces, and the displays that do not require glasses are a long way from being perfected.


Active 3D glasses

3D does, however, seem to be perfectly natural for computing and hand-held applications. I didn't see any demonstrations of 3D at a smaller scale, but it must be coming. Both computers and hand-held devices are single-viewer platforms that feature generated content (as opposed to filmed content). Knowing that there is only going to be one person in front of a display makes it easy to include lenticular displays that do not require 3D glasses, and generated content is easy to produce in 3D. It's pretty clear to me that it's not going to take very long for every handheld, gaming, and computing platform to go completely 3D.

The underwater imaging industry is typically slow moving, and it is frustrating to wait for our industry to catch up with what is going on in other industries. I didn't see many underwater housings at the show, but I'm told that [Amphibico](http://amphibico) was showing a housing for a Panasonic POV (point of view) system (not 3D). Also, I met up with John Ellerbrock of [Gates Underwater Housings](http://www.gatesunderwaterhousings.com/), who also had a prototype housing for the Panasonic POV available (upon request). John and I had a nice chat about where we thought 3D would go for underwater videographers. He has some interesting ideas, and I'm looking forward to seeing what Gates comes up with in the coming years.

If you are an underwater housing manufacturer, I urge you to start looking at manufacturing 3D housings. We all need to start experimenting with underwater 3D at a consumer level; it is clearly the future.

There are some really hard problems to solve for underwater folks who are thinking of going 3D. Howard Hall has repeatedly mentioned in interviews that the 3D IMAX system he uses works best on certain kinds of subjects, with clear constraints on size and distance. On land, it is easy to adjust convergence, but it is quite hard to do in underwater setups. Flat ports need to stay perpendicular to the direction the lens is pointing, which makes converging two cameras hard unless they are housed separately, and 3D is also sensitive to inter-ocular distance: separately housed cameras may end up too far apart to shoot subjects that are close. Dome ports are an issue because they distort images in ways that might not make for convincing 3D. Macro is problematic because it requires a tiny IO, and mirror-based 3D systems would be extremely bulky underwater.

But who knows? It's going to take exhaustive testing to see what does and doesn't work underwater. If you like to experiment and tinker, please make your own underwater 3D housing and start shooting. Think of this as a call to arms!

Wetpixel now has a new forum called [Underwater 3D](http://wetpixel.com/forums/index.php?showtopic=35589) where you all can discuss your thoughts about shooting 3D underwater. We'll see you online!

Special thanks to Mary Lynn Price of [DiveFilmHD](http://divefilmhd.com), who took time out of her workshops to chat and buy me lunch.

[smugmug url="http://photos.echeng.com/hack/feed.mg?Type=gallery&Data=11843870_zXrpP&format=rss200" imagecount="100" start="1" num="100" thumbsize="Th" link="lightbox" captions="true" sort="true" size="L"]