There are many reviews online saying that the Canon DSLRs, which are fantastic still cameras, fall short in their 1920x1080p video mode. Several web sites claim that the Canon DSLRs (not including the 5D) in 1080p mode resolve only a little better than 720p when tested on resolution charts. Online video examples show that their 1080p output does not match the 1080p resolution of a true video camera.
And then there are the video resolution charts.
Why might this be going on?
How do they downsample a 5184 x 2192 (16:9) raw image (using the 60D specification here – the 5D has slightly larger numbers)?
Here is my technical explanation for a possible reason, assuming the online reviews are correct about the soft video images. (I do not own a Canon DSLR, so I cannot test this myself.) My explanation may very well be off in space, but it offers a plausible technical reason. I can’t say this enough: my explanation may be totally wrong! And keep in mind that most people do not even see a difference between 1080p and 720p video on their home HDTV. (My explanation is also simplified in the details – each “photo site” actually has 2 green, 1 blue and 1 red sensor element, so translating sensor sites into megapixel images is more complex than my simplified explanation suggests.)
The image sensor on the camera is much higher resolution than 1920×1080 used in video. The original image must be converted or downsized to 1920×1080.
They likely take every 2nd row and every 2nd pixel across and throw away the extra rows and pixels. In 16:9 aspect ratio, the camera (60D) has a resolution of 5184 x 2192.
If this raster is simplified by pulling out every 2nd pixel and every other row, this yields, for 1080p:
5184/2 = 2592 pixels wide (or alternatively, take every 3rd pixel for 1,728)
2192/2 = 1096 rows high
To convert this into 1920x1080p, they might do a simple weighted average of pixels horizontally across each row (averaging the 2592-wide row down, or upsizing the 1728-wide row) to produce a 1920-pixel row. This softens the image horizontally but is an easy way to get to 1920. Dropping out rows also loses information, and the averaging adds information that wasn’t there (by blending multiple elements), creating moire and aliasing. The result is an image that looks better than 720p but softer than true 1080p.
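The hypothesized scheme above can be sketched in a few lines of NumPy. This is purely an illustration of my guess, not Canon’s actual firmware; the frame sizes come from the 60D numbers, and the linear interpolation stands in for whatever weighted average they might use:

```python
import numpy as np

# Fake 2192 x 5184 (rows x columns) luminance raster standing in for the sensor.
rng = np.random.default_rng(0)
sensor = rng.random((2192, 5184))

# Step 1: keep every 2nd row and every 2nd column (decimation).
decimated = sensor[::2, ::2]  # 1096 rows x 2592 columns

# Step 2: resample each 2592-wide row down to 1920 columns with a
# simple linear (weighted-average) interpolation per row.
src_x = np.linspace(0.0, 1.0, decimated.shape[1])
dst_x = np.linspace(0.0, 1.0, 1920)
frame = np.vstack([np.interp(dst_x, src_x, row) for row in decimated])

# Step 3: trim 8 rows from top and bottom to reach 1080 rows.
frame = frame[8:-8, :]
print(frame.shape)  # (1080, 1920)
```

Note that steps 1 and 3 discard sensor data outright, and step 2 blends neighboring pixels – which is exactly where the softening and aliasing would come from.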
(There’s a reference in a comment here that the 5D samples every 3rd row, and another comment saying it throws away every 3rd row … take your pick!)
720p is much simpler:
5184/4 = 1296
2192/3 ≈ 731
which is so close to 1280×720 that all they need to do is throw out some pixels at the edges. This creates worse moire because even more rows are thrown away.
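The 720p arithmetic is simple enough to check directly. Again, this is my hypothesized decimate-then-crop scheme, not a documented Canon pipeline:

```python
# Hypothesized 720p path: coarser decimation, then a small crop.
sensor_w, sensor_h = 5184, 2192

# Keep every 4th column and every 3rd row.
w = sensor_w // 4  # 1296 columns
h = sensor_h // 3  # 730 rows (2192/3 is roughly 730.7)

# Crop the small excess down to 1280 x 720.
crop_cols = w - 1280  # 16 columns to discard
crop_rows = h - 720   # 10 rows to discard
print(w, h, crop_cols, crop_rows)  # 1296 730 16 10
```

With two of every three rows discarded before any filtering, vertical detail above the Nyquist limit folds back as moire – consistent with the 720p artifacts reviewers report.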
The Canon 720p should look fine except for moire. The Canon 1080p is going to look softer than true 1920x1080p and will introduce aliasing artifacts.
With only one DIGIC 4 processor on board, they likely lack the processing capacity to do a clean conversion from the 5184×2192 sensor down to 1920×1080, 30 times per second! Instead they take shortcuts to make it work.
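A quick back-of-the-envelope calculation shows why: reading the full sensor at video frame rates means processing hundreds of millions of samples per second.

```python
# Rough pixel-throughput arithmetic behind the "shortcuts" argument.
sensor_pixels = 5184 * 2192   # ~11.4 million photosites per frame (60D, 16:9)
fps = 30
samples_per_sec = sensor_pixels * fps
print(f"{samples_per_sec / 1e6:.0f} million sensor samples per second")
```

Roughly 341 million samples per second would have to be read and filtered for a clean full-sensor downscale – a heavy load for a single image processor, and a plausible motivation for skipping rows instead.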
My prediction: starting this fall, the Mark III will have a dual core processor, and all cameras announced from then on will have dual core processors. Within a year, Canon will solve the moire problem.
Some of the cameras from competitors have dual core or even tri-core processors now.
DSLRs do enable photographers to shoot video with very narrow depth of field, which would be hard to achieve on all but very expensive, high-end professional video cameras – or by using various schemes, like the Letus adapter, that project the image onto a screen.
DSLRs also generally have very good low light performance capabilities.
Ultimately, resolution is hardly the only criterion for shooting with a DSLR. Convenience, size, low-light performance, depth of field, lens quality – and simply liking how the image looks – may matter more.