dark_jedi said:
AH, sorry but you are mistaken, I have ripped plenty of BD50's to my hard drive and it takes about 30-40 minutes, if it takes longer than that, then you have a shitty PC and an even worse Blu-ray drive\Burner, or you have no idea what you are doing.
I'm running a quad-core AMD with 8 GB of RAM on a 64-bit operating system. It takes longer than 40 minutes to rip a BD on my system. It could be the BD drive that's causing the problem, but I'm not going to replace it. And besides, I really don't do it all that much.
Darth Editous said:
They have not been modified at all - cherry-picked to prove a point, perhaps. I chose a reasonable amount of compression for the jpeg and they are an accurate depiction of those two sections of an HDTV frame and a BD frame.
Those JPEGs really don't mean much, and you know better than to do that. I want you to show a 720p video example, rip vs. retail, side by side.
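If anyone actually wants to put such a comparison together, a rough way to grab the same frame from both sources and stack them is something like this (just a sketch; it assumes ffmpeg is installed, and rip.mkv / retail.mkv are placeholder filenames):

[CODE]
import subprocess

# Grab one frame from each source at the same timestamp and stack them
# side by side. Pick a timestamp with fine detail (grain, dark scenes, fades).
timestamp = "00:10:00"
subprocess.run([
    "ffmpeg",
    "-ss", timestamp, "-i", "rip.mkv",      # the 720p rip
    "-ss", timestamp, "-i", "retail.mkv",   # the retail source
    "-filter_complex", "[0:v][1:v]hstack=inputs=2",
    "-frames:v", "1",
    "side_by_side.png",
], check=True)
[/CODE]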
Darth Editous said:
"In the United States, 720p is the preferred format for the broadcast and cable networks of Fox/FX/Fox Sports Net, ABC/Disney Channel/ESPN, A&E Television Networks, Ion Television, MLB Network, and DirecTV's Audience Network."
That's true, but not all TV sets do 720p. Those networks still broadcast in 1080i as well. 720p may be the preferred format of those networks, but they still have to accommodate viewers whose sets can only decode 1080i (or who are too stupid to know otherwise), because there are still a few existing HD sets that only handle 1080i. So what I said was still true, just a little one-sided.
Darth Editous said:
That's a massive generalisation, and not relevant to already frame-based material.
Actually, film-based material would be a better term, and yes, a progressively scanned frame will look better than an interlaced frame. You lose picture quality (color and detail) through interlaced decoding because the frame is split into two fields that alternate. Not so with progressive decoding, where everything is scanned at once.
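To make the "two fields" point concrete, here's a rough NumPy sketch of how an interlaced frame is carried as alternating fields (an illustration only, not how any particular decoder actually works):

[CODE]
import numpy as np

# A progressive 1080p frame: every row belongs to the same instant in time.
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

# Interlacing sends the frame as two fields of 540 lines each, captured a
# field-interval apart: the even-numbered rows, then the odd-numbered rows.
top_field    = frame[0::2]   # rows 0, 2, 4, ... -> shape (540, 1920, 3)
bottom_field = frame[1::2]   # rows 1, 3, 5, ... -> shape (540, 1920, 3)

# A naive "weave" deinterlace just re-interleaves the fields. If the two
# fields come from different moments, moving edges comb and detail is lost.
rebuilt = np.empty_like(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field
[/CODE]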
Darth Editous said:
This is only relevant if the HDTV rips were made from an analogue cable broadcast, which seems extremely unlikely. If you're on digital cable and you can see a difference in picture quality with or without a line conditioner... well, I'll stop there, because you just can't, but I have some Monster cables you can buy off me.
(edited to add: actually this is probably not quite right. Line hum can cause waves and other distortions to get through to the TV via earth, but if you're capturing digitally, it won't make any difference)
Get rid of your Monster cables and go with something that has better-quality terminators. Ones and zeros are one thing; most any cable will get you all of that. The problem is how the cables are manufactured. If they're cheap, then they're made cheaply, they'll have cheap connectors, and a cheap connector does make a difference. Monster cable isn't really all that bad, but I have seen a lot better. The better ones have more silver in the connectors. AudioQuest has a great high-performance line that beats Monster hands down. Tributaries makes great high-performance cable as well, as does Kimber Kable.
Darth Editous said:
No, it gives you a perceived resolution of about 1920x756, if you're talking about truly interlaced material. For progressive material you get a perceived and actual resolution of 1920x1080 (barring some minor differences in how colour is processed).
Well, if you're talking actual pixels on the screen, then your 1920x756 is wrong. It should be 1920x754, which works out to 1,447,680 pixels for a 2.35:1 frame.
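For anyone following along, the arithmetic behind the figures being thrown around works out like this (a quick sketch; the 0.7 interlace factor is just the usual rule of thumb, not an exact number):

[CODE]
# Full progressive HD frame
print(1920 * 1080)          # 2,073,600 pixels

# The ~756-line figure comes from the rule of thumb that interlacing
# delivers roughly 70% of the nominal vertical resolution.
print(round(1080 * 0.7))    # 756

# Pixel count for a 1920 x 754 active picture area (the figure cited above).
print(1920 * 754)           # 1,447,680
[/CODE]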