lurker77 said:
^ That article explains perfectly what I was concerned about when I heard these guys were using an off-the-shelf digital camera instead of a professional scanner.
Bayer filter chips only record one colour at each photosite and use demosaicing algorithms to guess the other two from neighbouring pixels, which costs some colour accuracy and effective resolution. It's not horrible, but it's not the real thing.
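Here's a minimal sketch (my own illustration, not anyone's actual camera pipeline) of what that interpolation looks like: an RGGB Bayer sensor keeps one colour per photosite, and the demosaic step has to guess the other two back from neighbours. It assumes numpy and scipy are available, and the function names are just made up for this post.

```python
import numpy as np
from scipy.signal import convolve2d

def make_bayer_mosaic(rgb):
    """Sample an (H, W, 3) RGB image through an RGGB Bayer pattern,
    keeping one colour value per photosite and discarding the rest."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=float)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def demosaic_bilinear(mosaic):
    """Crude bilinear demosaic: fill each missing colour with a weighted
    average of nearby photosites of that colour. Real cameras use much
    smarter algorithms, but every one of them is still interpolating here."""
    h, w = mosaic.shape
    sparse = np.zeros((h, w, 3))
    mask = np.zeros((h, w, 3))
    sparse[0::2, 0::2, 0] = mosaic[0::2, 0::2]; mask[0::2, 0::2, 0] = 1
    sparse[0::2, 1::2, 1] = mosaic[0::2, 1::2]; mask[0::2, 1::2, 1] = 1
    sparse[1::2, 0::2, 1] = mosaic[1::2, 0::2]; mask[1::2, 0::2, 1] = 1
    sparse[1::2, 1::2, 2] = mosaic[1::2, 1::2]; mask[1::2, 1::2, 2] = 1
    kernel = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
    out = np.empty_like(sparse)
    for c in range(3):
        num = convolve2d(sparse[:, :, c], kernel, mode="same")
        den = convolve2d(mask[:, :, c], kernel, mode="same")
        out[:, :, c] = num / np.maximum(den, 1e-9)
    return np.where(mask > 0, sparse, out)  # keep the measured samples
```

Two-thirds of the colour values in the output are estimates, which is the "cheating" I'm talking about.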
Top-of-the-line film scanners use three sensors, one for each colour: no cheating, full accuracy. But they cost millions of dollars, thanks not only to the extra sensors but also to elaborate film gates, precise sensor alignment, a prism that splits the colours between the sensors, and most of all, insanely good build quality.
The next best thing is something the article touches on: oversampling. Put simply, if you use a 12-megapixel digital camera to capture a 4K image, it will turn out almost as good as a 3-chip scan. Today's DSLRs do 12 megapixels.
EDIT: My mistake. 12 megapixels is not the same thing as 12,000 pixels per line. That would be 48 megapixels, beyond what any off-the-shelf camera can do ATM.
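For what it's worth, here is a back-of-the-envelope sketch of the oversampling idea: capture at a higher resolution than the 4K target, then downsample so every output pixel is an average of several captured photosites, which washes out much of the per-pixel demosaic guessing. The 2x factor, the function name, and the frame sizes are my own assumptions for illustration, not this project's pipeline.

```python
import numpy as np

def downsample_2x(img):
    """Average non-overlapping 2x2 blocks of an (H, W, 3) image, halving
    the resolution on each axis (one quarter of the pixel count)."""
    h, w, c = img.shape
    h2, w2 = h - h % 2, w - w % 2                    # trim odd edges
    blocks = img[:h2, :w2].reshape(h2 // 2, 2, w2 // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# Toy stand-in for a camera frame; a real capture with ~8,000 pixels per
# line would be roughly (4320, 7680, 3) and reduce to a (2160, 3840, 3) master.
capture = np.random.rand(432, 768, 3)
master = downsample_2x(capture)
print(master.shape)  # (216, 384, 3)
```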
Millions of dollars?? What scanners are those? I'd like to read about them. That's obviously not practical, but this project will give us great results that will sustain us for years, and by the time house-sized TVs are common, maybe scanners will have advanced and gotten cheaper too. When I worked at a camera store in 2000, a one-megapixel digital camera cost hundreds of dollars. You could have gotten a Nikon F100 for not much more (which was much, much better).
The only improvement I would suggest to their current setup is to use an SLR, if only because they shoot faster and have larger apertures (and arguably better focus). And you can get 18 MP SLRs now for $800 or so. I got my 8 MP Rebel XT for $200 used, and that was years ago.