captainsolo said:
...truth be told I have never seen a home disc or digital print look better than 35mm no matter the condition.
I think I may be able to at least partially explain the divergence between the studies' numbers and people's experiences, but it may take someone with a photography background to flesh out the details and/or point to the studies I'll be referring to. The numbers aren't entirely pulled out of my ass, but they are from my memory, which is about as bad, sorry.
Back when digital cameras (we're talking still images now) started going mainstream, there was a lot of discussion about when they'd finally overtake 35mm quality-wise. Someone finally did a study on the smallest resolvable detail in 35mm vs. digital and determined that (IIRC) ~4 megapixels was equivalent to 35mm. This result, of course, was laughed out of the room--because anyone could easily tell the difference between a 35mm still and a 4 MP still, and the 4 MP image looked like crap, relatively speaking.
But it turned out that there was a meaningful difference between the two standards: "smallest resolvable detail" vs. "ability to tell the overall difference." Because digital images have their pixels aligned in a grid and analog images do not, it takes something on the order of 3x to 4x as many digital pixels to create an image that can't be distinguished from 35mm (the human eye can perceive aligned pixels and interprets them as lower quality than unaligned ones at the same resolution). Both camps still considered themselves correct, because each had technically answered its own question accurately--but nobody sells 4-megapixel still cameras anymore.
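Just to make the arithmetic explicit (using my from-memory numbers above, not anything from the actual study), here's a quick back-of-the-envelope sketch of what the 3x-4x multiplier implies:

```python
# Rough arithmetic for the two standards described above.
# The 4 MP "smallest resolvable detail" figure and the 3x-4x perceptual
# multiplier are the from-memory numbers in this post, not measured values.

DETAIL_EQUIVALENT_MP = 4.0           # ~35mm match on smallest resolvable detail
PERCEPTUAL_MULTIPLIER = (3.0, 4.0)   # extra pixels needed before the grid stops being noticeable

low = DETAIL_EQUIVALENT_MP * PERCEPTUAL_MULTIPLIER[0]
high = DETAIL_EQUIVALENT_MP * PERCEPTUAL_MULTIPLIER[1]

print(f"Perceptually indistinguishable from 35mm: roughly {low:.0f}-{high:.0f} MP")
# -> roughly 12-16 MP, which is about where consumer still cameras ended up
```

So both answers can be "right" at the same time: 4 MP on the resolvable-detail standard, something like 12-16 MP on the can-you-tell-the-difference standard.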
Anyway, I thought this or something like it could apply here.