
3D Video image stabilization

When shooting 3D video using two cameras, we turn off all in-camera stabilization features. Unfortunately, this makes handheld video look shaky, since all the little hand movements remain in the original clips. Consequently, we have to shoot most 3D on a tripod.

I just did a test, though, using my Canon Vixia HF M301 video cameras (I use two of them to shoot stereoscopic 3D). The HF M301 has three stabilization options: off, standard, and dynamic. For my test, I shot one scene using standard stabilization and another using dynamic. I then paired the left and right tracks in Magix Movie Edit, output a WMV 3D anaglyph file, and watched the result.

For general handheld shots without rapid camera movements, the in-camera stabilization tracked well between the two cameras. However, rapid camera movements caused different stabilization effects in each camera, and the 3D goes bonkers as one image shifts left while the other shifts right!

Bottom line: for simple handheld shots without a lot of movement, the in-camera stabilization works okay and eliminates the handheld jerkiness.

Ideally, we could stabilize the 3D images after editing, but there is no easy way to do that.

Most video editing software today includes image stabilization. The software analyzes the video for jerky movements and then corrects them by shifting each frame to cancel out the jitter. Since the shifting leaves black bars at the top, bottom, or sides, the video is also enlarged slightly to push those bars out of the frame.
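That shift-and-zoom idea can be sketched in a few lines. This is only an illustrative outline, not any editor's actual algorithm: it assumes the per-frame camera shifts have already been measured by motion analysis, and it smooths the camera path with a simple moving average.

```python
# Illustrative sketch of shift-and-zoom stabilization (assumed inputs:
# per-frame horizontal camera shifts in pixels from prior motion analysis).

def smooth_path(shifts, window=5):
    """Moving average of the camera path: the smoothed path is the
    motion we keep; the difference from it is the jitter we remove."""
    half = window // 2
    smoothed = []
    for i in range(len(shifts)):
        lo, hi = max(0, i - half), min(len(shifts), i + half + 1)
        smoothed.append(sum(shifts[lo:hi]) / (hi - lo))
    return smoothed

def corrections(shifts, window=5):
    """Per-frame correction = smoothed path minus actual path."""
    return [s - a for s, a in zip(smooth_path(shifts, window), shifts)]

def zoom_to_hide_bars(corrs, frame_width=1920):
    """Enlarge just enough that the largest shift never exposes a black bar."""
    max_shift = max(abs(c) for c in corrs)
    return (frame_width + 2 * max_shift) / frame_width

# A made-up shaky horizontal path, in pixels:
jitter = [0, 4, -3, 5, -2, 1, 6, -4, 2, 0]
corr = corrections(jitter)
print(round(zoom_to_hide_bars(corr), 4))  # -> 1.0056, well under a 1% zoom
```

The zoom factor stays small because the corrections are only a few pixels against a 1920-pixel-wide frame, which is why stabilized footage usually looks only slightly cropped.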

When it comes to 3D video, there does not seem to be a great solution. Sony Movie Edit Platinum 11 disables the stabilization feature on paired 3D clips. Magix Movie Edit Pro MX Plus can stabilize individual clips before pairing, but there is no good way to match the stabilization between the left and right tracks. You can stabilize one clip, copy its effects track, and paste it onto the second clip, but this has not produced the expected result of matching stabilization.

I suppose we could create our 3D track, output it to a video file, import that file as a single clip (which might as well be 2D as far as the editor is concerned), and then apply stabilization. I am not sure I want to go through two more transcodes, though!


The 3D rig I used for a Civil War Battle Re-enactment 3D video

I used two Lumix GH-2 DSLRs to shoot 3D video. In this configuration, the lens centers are spaced about 6 inches apart, which means that primary subjects need to be at least 15 feet, and preferably 20 feet, away from the camera to avoid 3D eye strain (convergence going bonkers).
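The 15-foot figure lines up with the common "1/30 rule" of thumb for stereo shooting: keep the lens spacing (interaxial) to no more than about 1/30 of the distance to the nearest subject. The sketch below just applies that rule of thumb; it is not a formula from any particular camera manual.

```python
# The "1/30 rule" of thumb for stereo shooting: interaxial spacing
# should be at most ~1/30 of the distance to the nearest subject.

def min_subject_distance_ft(interaxial_inches, ratio=30):
    """Nearest comfortable subject distance, in feet."""
    return interaxial_inches * ratio / 12.0

print(min_subject_distance_ft(6))  # 6 in spacing -> 15.0 ft
print(min_subject_distance_ft(3))  # 3 in spacing -> 7.5 ft
```

Note how halving the spacing to 3 inches, as with the Kodak Playsport pair mentioned below, halves the minimum comfortable subject distance.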

The cameras are screwed onto an aluminum rail using 1/4-20 knobs picked up at a local hardware store. Audio is recorded with two shotgun mics (bought off eBay, and yes, one has a homemade wind muff) feeding into a factory-refurbished Beachtek audio mixer, which connects to one of the cameras. The shotgun mics make a HUGE difference in audio quality. Unfortunately, I had left my XLR mic cables in my car for the first battle, so I recorded the good audio only on the second battle.

The six-inch lens spacing is okay for outdoor landscapes and events where the subject is typically 20+ feet away. But experimenting with the Kodak Playsport cameras that I picked up used, I find the 3″ lens spacing produces a much more pleasing 3D effect. Probably not a big surprise; after all, what’s the spacing between your eyes? Probably not six inches!

But the GH-2 does have some nice features, like being able to align the zooms on the two cameras fairly easily (and closely enough). For video, the 14-42mm stock lens also provides multiple focal lengths: not only the 14-42mm range, but also the “Extended Telephoto” mode, which crops the 1080 field out of the full image sensor, roughly doubling the effective focal length at 1080p.
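The "roughly doubling" comes from simple arithmetic: the Extended Telephoto crop takes a native 1920-pixel-wide window from the sensor instead of scaling the full sensor width down. The sensor width below is an assumed round figure for illustration, not the GH-2's exact specification.

```python
# Rough arithmetic behind an "Extended Telephoto" style 1080p crop.
# The 4000-pixel sensor width is an assumed illustrative value,
# not the GH-2's actual sensor dimension.

def etc_multiplier(sensor_width_px, frame_width_px=1920):
    """Effective focal-length multiplier from cropping a native
    1080p window out of the sensor instead of downscaling it."""
    return sensor_width_px / frame_width_px

mult = etc_multiplier(4000)
print(round(mult, 2))                                   # ~2.08
print(f"14mm end acts like ~{round(14 * mult)}mm")      # ~29mm
```

Any sensor roughly twice as wide as 1920 pixels gives the "roughly doubling" effect the GH-2 mode provides.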

Time permitting, I hope to produce a tutorial on shooting 3D with ordinary consumer cameras. From what I have seen, consumer-oriented 3D “all in one” cameras do not deliver the video quality that interests me. While it’s a little more work, two consumer cameras can deliver surprisingly good 3D results, clearly better than the “all in one” approach. Plus, I can use external mics, which most low-end cameras do not support.

In theory, we are supposed to time-sync the two cameras for precise frame alignment. But for most activities, and for viewing on a computer monitor or even an HDTV, frame-level synchronization while editing seems plenty adequate. I am not shooting for the local movie theater or an IMAX screen!

How do I sync the two cameras? I just snap my fingers to put a pulse on the audio tracks (or, in the case of the Civil War battle, musket fire works quite nicely too). Then, in the 3D editing software, I use the audio track pulses to align the video segments.


Another 3D mobile phone announced

LG Reaffirms Commitment to 3D with Optimus 3D Max.

From a photography or video perspective, we may not be paying sufficient attention to the mobile smartphone category. There are now quite a few mobile phones featuring glasses-free 3D displays, and I also saw some glasses-free 3D tablet demos at the 2012 Consumer Electronics Show.

The problem is: what do people do with a 3D phone? There is not a lot of content, and maybe only a few games support 3D. Is watching a video, let alone a 3D video, something we want to do on our smartphones?
