
Field of View: How Sony’s A900 (and More to Come) Change the Game


24 Megapixels and Counting

Having just finished shooting a card full of images with Sony’s new Alpha 900 camera, I can honestly say that this is a brave new world, one where file sizes from digital cameras will never be the same. When opened, the images yield almost 70MB of information, something unheard of except with the high-end digital backs reserved for pros with their medium and large format cameras. Priced at about $3000 as of this writing, the A900 is not, Sony says, a pro camera; rather, the company sees it as a camera for the enthusiast who wants the highest available image quality, especially when paired with the Carl Zeiss lenses Sony proudly suggests for it.
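For the curious, the arithmetic behind that figure is simple enough to sketch. This is my own back-of-the-envelope check, using approximate pixel dimensions for a 24MP sensor rather than anything from Sony’s spec sheet:

```python
# Back-of-the-envelope check on the "almost 70MB" figure.
# Assumed pixel dimensions for a 24MP sensor (approximate, for illustration).
width_px, height_px = 6048, 4032
pixels = width_px * height_px                      # ~24.4 million pixels

# An opened image carries three color channels (R, G, B).
# At 8 bits (1 byte) per channel:
size_8bit = pixels * 3                             # bytes
print(f"8-bit RGB:  {size_8bit / 2**20:.1f} MB")   # -> ~69.8 MB

# Opt for 16-bit-per-channel editing and the opened file doubles:
size_16bit = pixels * 3 * 2
print(f"16-bit RGB: {size_16bit / 2**20:.1f} MB")  # -> ~139.5 MB
```

So “almost 70MB” is just what 24 million pixels look like once decompressed into an 8-bit working file; shoot and edit in higher bit depth and the number climbs from there.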

In the past, that kind of pixel power would usher the camera immediately into the pro ranks. True, the Sony has a lot more going for it than mere pixel-packing power, such as an impressive image processor that makes ISO 800 seem like ISO 200, a dynamic range feature that opens up shadows in-camera like the best Adobe has to offer, plus a wrinkle on Live View that works more like a pre-exposure preview, which to me makes lots of sense. But you have to admit that putting 70MB at the touch of the shutter release button into the hands of amateurs, be they enthusiast or merely well heeled, changes both expectations and how folks will deal with all manner of image-related tasks. The camera challenges a number of “habits” that digital photographers have adopted: it will change the size and number of memory cards they need to carry, increase the need for enhanced file management, and make it necessary to upgrade the image processing power of their computers to process, print and share after the downloading is done.

Pixel Envy

Perhaps we have digicams, or integral lens digital cameras, to blame. After all, with 10MP now the norm and 12MP and higher being touted for the coming year, it’s no wonder that DSLR owners are looking at their 12 and even 14MP cameras and feeling left out. Sure, sensor size, pixel size and image quality are tied together, but try explaining signal-to-noise ratio to most end users and they will get that faraway stare. Users know that the so-called full-frame sensors in their DSLRs somehow produce better image quality than APS-C sensors, but when you delve into the sophistication of image processors and bring in the variable of ISO settings it can get tricky. And for most folks who are not spending nights making 13×19-inch prints, the differences between a 10 and a 14MP sensor are not that great. But when you bring a 24MP sensor to the table, they know something is up and pay heed.
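For readers who do want a peek behind that faraway stare, here is a minimal sketch of why sensor size and pixel size travel together. The sensor dimensions and pixel counts below are illustrative assumptions of my own, not any manufacturer’s official specs:

```python
# Pixel pitch (the width of one photosite) falls out of sensor width / pixel count.
# Bigger photosites gather more light apiece, which is the root of the
# signal-to-noise advantage of larger sensors. All dimensions are approximate.

def pixel_pitch_microns(sensor_width_mm: float, width_px: int) -> float:
    """Approximate photosite width in microns."""
    return sensor_width_mm / width_px * 1000

# 24MP full frame (roughly the A900's geometry):
print(f"Full frame, 24MP: {pixel_pitch_microns(35.9, 6048):.1f} um")  # ~5.9 um

# 14MP APS-C (a typical geometry):
print(f"APS-C, 14MP:      {pixel_pitch_microns(23.4, 4672):.1f} um")  # ~5.0 um

# 12MP compact digicam (1/1.7-inch class sensor):
print(f"Digicam, 12MP:    {pixel_pitch_microns(7.6, 4000):.1f} um")   # ~1.9 um
```

The rough threefold gap in photosite width between the full-frame DSLR and the 12MP digicam is, in crude terms, why the megapixel number alone tells you so little about image quality.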

This is not to denigrate the buying public or suggest a lack of sophistication in this market. Heaven knows they do their research, and the level of questions I encounter at workshops and in class always floors me. But the assumption that more megapixels equal better quality is something the industry has been pushing all along, and now the chickens of that marketing approach are coming home to roost. It means that an excellent 14MP camera, which produces more information and inherent image quality than most folks will ever use, will somehow be considered less viable than one with a higher MP count. And while you could make the argument that yes, there is more to a higher MP camera, the question becomes how high the industry has to go before someone finally comments on the emperor’s new clothes.

File Bloat

Just what does a 70MB-per-image file size mean to the average customer? Think about how we have oriented our software, hardware and accessories to the pre-70MB world. I just got a portable picture storage device for review, one with a lovely 4-inch screen and an 80GB drive. In the past I would have thought this device capable of handling a season’s images; now I’d probably opt for the 160GB model as a minimum. When buying a computer, users will have to adapt by getting more RAM installed right out of the gate. Backup drives will need to be in TB (terabyte) territory. Software will have to be able to handle huge individual files, and should the user opt for higher bit depth capture, who knows how long it will take just to adjust color balance on an “older” machine. And card manufacturers had better start pushing out 16GB cards as the base minimum capacity. As I understand it, the current card standards built on FAT32 top out at 32GB. Does that mean everything will have to swap up again and we’ll need a new file system to handle ever-greater capacity cards? And pity the poor online processing and sharing service that has to start crunching 70MB files from those not having a clue as to how to resize their images, and how many days those with a major MP camera and dial-up service will wait to get one image onto Flickr.
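To put some rough numbers on those worries, here is the arithmetic, treating the opened 70MB file as the working unit. In-camera RAW and JPEG files are compressed and smaller, so these are worst-case figures; the connection speed and capacities are illustrative assumptions:

```python
# Back-of-the-envelope math on what 70MB-per-image does to storage and upload.
# All figures are illustrative assumptions, not measured values.
MB = 2**20
GB = 2**30

opened_size = 70 * MB  # one 24MP image, opened in the editor

# How many such files fit on common capacities?
for capacity_gb in (16, 80, 160):
    count = (capacity_gb * GB) // opened_size
    print(f"{capacity_gb}GB holds about {count} 70MB files")
# -> 16GB: ~234, 80GB: ~1170, 160GB: ~2340

# And that dial-up upload (56 kbps, ideal conditions, no protocol overhead):
dialup_bps = 56_000
seconds = opened_size * 8 / dialup_bps
print(f"One 70MB file over dial-up: about {seconds / 3600:.1f} hours")
# -> roughly 3 hours per image, before a single retry
```

Not quite “days” per image, but close enough to make the point: a card-full of such files and a dial-up line simply don’t belong in the same sentence.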

The point here is not to push the panic button or decry progress. It is more to call attention to the direction some segments are taking and the changes this mode of marketing and development implies. If you look at the imaging industry from afar, it looks like four men in a horse costume who have not rehearsed their act. One leg pushes forward while another lags behind; the head bobs backwards while the hind legs rear into the air. Everyone is constrained by the costume—the infrastructure—and all want to be in on the act, but somehow the gait never seems right. Still, you have to admit that, while not graceful, the horse somehow lumbers around the track.

There’s no question that when you shoot with a high MP camera there is a visceral thrill in experiencing the possibilities such devices afford. There’s a newfound ability to crop and still maintain a large print size; to make images at sizes once available only to high-end industrial and pro shooters; to feel freed from most restraints on what you can do with the images later. There’s also the satisfaction of seeing the impressive developments in image processing, with brute-force processors that can handle such massive files, all built into a classic 35mm form factor. These in-camera microprocessors can churn through several of these huge images every second. There’s the sheer appreciation of how far we’ve come, and the potential of what might come in the future.

Uncharted Territory

It feels, though, as if we are on some cusp, some edge of development that might break completely with the way we did things in the past, bringing brand new ways of capturing, processing and sharing images. We are now at a stage where eking out incremental increases in megapixels, or designing clever plug-ins and image processing shortcuts for our current level of file sizes, will no longer be tenable.

When it comes to how all that will manifest itself, and what the next wave will bring, your guess is as good as mine. All I know is that when we get into this kind of megapixel territory, the entire infrastructure will have some catching up (again) to do before we can take the next step. The past five years have been a whirlwind, and given the current pace of change, all bets for the near future seem off. I have a sense that we ain’t seen nothin’ yet, but I also have faith that as an industry we will adapt to the technology that’s right around the corner. The genie is out of the bottle and we have to reshape our thinking to keep pace. If we don’t, that horse may be even more ungainly than it appears now.
