
lordjedi

User Group: Members
Join date: 8-Jun-2005
Last activity: 9-Apr-2015
Posts: 1,640

Post History

Post #337814
Topic: Windows 7
Tiptup said:

Though, let me just say, again, before I move on, that I'm not talking about XP getting all (or even most) of DirectX 10's improvements (with all of its increased speed and whatever else), I'm just talking about some kind of direct support for a specific hardware-based effect being put into DirectX 9. If that's not an accurate view of something like "Geometry Shading" (since you'd need practically all of Direct X 10 to have it) or if it's simply not possible to implement direct support for that kind of effect in DirectX 9 easily (it would practically need a complete reworking of DirectX 9), then I'll simply take your word on that and move on. All I ask is that you stop wasting my time with arguments that I am not making. Thank you.

Since no one but a Microsoft employee can see the source code of DirectX, we pretty much have to assume that any new effects that are only available in DirectX 10 would require more effort than it's worth for a simple patch.  Since the driver model was also completely rewritten, we also have to assume that any patch would have to be written in a completely different way for XP and would be more trouble than it's worth.  It's probably a few weeks or months of work versus an hour or two.

You see, lordjedi? What the fuck is that top paragraph about there? When did I say that it "doesn't look" to me like Microsoft "put a lot of work" into DX10? I've granted that possibility and virtually said the opposite quite a few times. Why are you making that argument with me? You're wasting my time with stupidity and I don't like stupidity. Please, start reading what I'm saying. Thank you.

Your comments suggested that Vista didn't seem all that different from XP, just slower and buggier.  If that's not what you meant, then I'm sorry.

I just got done helping a friend of mine wipe his whole Vista-run computer because the thing was buggy as hell. It was a Core2 Duo laptop with great hardware specs and yet he couldn't even get through an install of WoW on it. Even after I wiped the hard drive and reinstalled Vista clean (and updated it with SP1), it still ran like crap. The computer restarts at odd points, slows down sometimes, and occasionally gets stuck at blank screens. Other Vista machines (desktops mostly) I've played around with have been buggy in other nasty ways (restarts, freezes, and other crap like that). XP, however, was stable from the day I began using it and is generally stable on every other machine I've seen it installed or played with it on. That's a big difference and an important issue to me. Are you saying that stability shouldn't matter to me?

So he was trying to run WoW on a laptop?  What kind of video card does this laptop have?  Laptops are notoriously bad for games.  Read the system requirements of most any game in the past year.  They'll have a list of video chipsets that are supported, possibly even listing specific cards, and then they'll say "Laptop versions of these chipsets are not supported".

My guess would be, without knowing the full hardware specs, that the laptop is probably getting too hot or needs its video and audio drivers updated.  Typically, that's what causes games to either not run or to run very poorly.  Rarely does the problem come from the OS itself.

Again, laptops, even the ones with Nvidia or ATI graphics, really aren't designed for gaming.  It'll usually work if you have one of those chipsets, but you pretty much take your chances.  I'm going to assume you ended up putting XP on the machine and got it to work, even though you didn't state that.

And, I'm sorry, but have you noticed those shitty little messages asking you to confirm every little simple action in Vista? Microsoft didn't even implement a "don't ever ask me about this again" option!

Like I said, I only see these things when I'm installing software or changing system settings.  And they actually do have a setting for not asking you about it again.  It's called turning UAC off :P

It's an incredibly obnoxious feature! From that crappy, needless chore alone I will have to warn you not to argue that Vista is "superior" to XP in every last way. If you try to tell me that it's a good thing for an OS to have because it protects stupid people, I'll simply have to laugh at you. A horribly stunted interface is not a decent trade off for the protection of dumb people.

Laugh at me all you want.  I've even had times where I got the popup and wondered "what the hell?"  At which point I click Deny and then realize that whatever setup program I just tried to run tried to spawn some weird process with a completely different name.  The software's usually ok, but it throws me when it does it.  This is a good thing imo.  I want to know if some program is doing something weird.  I've worked on enough machines to know that if people had some kind of big, scary warning, their machine probably wouldn't have gotten screwed up to begin with.  That popup window alone would scare the crap out of my mom and cause her to call me, asking if whatever it was was safe to do.  Quite frankly, I'd rather get that call than the ones I get now where I hear "It's messed up and I don't know how it happened" because some damn ad on some website looks like a Windows warning message telling her her computer is infected so she clicks on it to clean it up.  The UAC warning would stop it every time.

OS X does something very similar.  When you try to install any software that needs to change system settings, it pops up a dialog asking for the admin password.  Why Vista gets such a bad rap for doing the same thing OS X does I'll never know.
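
For the curious, here's a rough sketch in C (my own illustration, not Apple sample code; built with -framework Security) of the kind of Authorization Services call that triggers that password dialog on OS X:

#include <Security/Authorization.h>
#include <stdio.h>

int main(void)
{
    AuthorizationRef auth = NULL;
    /* Request the generic admin right; the OS itself puts up the
       username/password dialog if the user isn't authorized yet. */
    AuthorizationItem item = { kAuthorizationRightExecute, 0, NULL, 0 };
    AuthorizationRights rights = { 1, &item };
    AuthorizationFlags flags = kAuthorizationFlagDefaults
                             | kAuthorizationFlagInteractionAllowed
                             | kAuthorizationFlagExtendRights;

    if (AuthorizationCreate(&rights, kAuthorizationEmptyEnvironment,
                            flags, &auth) == errAuthorizationSuccess) {
        printf("User authenticated as an admin\n");
        AuthorizationFree(auth, kAuthorizationFlagDefaults);
        return 0;
    }
    printf("Authorization denied or cancelled\n");
    return 1;
}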

lordjedi said:

Microsoft's "support" for XP hasn't ended.  The only thing that ended is retail availability:

http://www.microsoft.com/windows/lifecycle/default.mspx

Ahh, so are you trying to tell me that Microsoft is "supporting" XP in every way it can? :)

I'm sorry, but even brand new products don't get absolute support and you know that. The question in this debate is what level of support would be an ideal balance for Microsoft to make the most short-term profits while preserving the long-term profits they'd get from happy customers (who won't get angry enough to rework our nation's patent laws for example). Getting cute and saying that there is a non-zero level of support is not helpful to that discussion. You know what I mean by "support" and I'd like to get back to discussing that real issue now. :)

What do you mean by absolute support?  If you have a problem with Vista, a brand new product, you can call Microsoft and receive installation support.  I believe they give you 90 days for free when you buy a copy.  After that, it's $250 per incident.  So what are we calling "absolute support"?

lordjedi said:

XP is outdated and has run its course.  Here's their support timeline:

http://support.microsoft.com/lifecycle/?p1=3223

As you can see, Support is still available for XP.  Just because you can't buy XP from a store (retail availability) or from MS, doesn't mean it isn't supported anymore.

Retail availability is a kind of support. So, by definition, we cannot say that MS is "supporting" XP in at least that way, can we? There are other ways in which it is not being supported anymore too. This is a silly point of yours. ::yawn::

Actually, I can.  http://www.newegg.com/Product/Product.aspx?Item=N82E16837116195  That's a retail copy of Windows XP Professional.  I didn't even know it was still available until yesterday when I received an email asking about it.  I don't know what other ways you're referring to, so I'll continue below.

 

lordjedi said:

You're right, the free market does have many ideas of software support.  Which is why there are companies out there still supporting NT, even though Microsoft doesn't offer support.

To answer your question, without knowing what that patch fixed, I couldn't say if [Blizzard Entertainment is] stupid or not.  If that patch only took a few man hours to work on, then no, they probably aren't stupid.  But if it took several weeks to do, then yes, I'd wonder why they worked on it.  With the success of WoW, I wouldn't understand them putting any extended effort into any of their legacy products.

The funny thing is, you can't BUY Starcraft from them without getting it as a digital download.  So why don't you go try to buy a 10 year old game (aside from a digital download) and see how easy it is?  I'm sure you could get a used copy, just like you can get a used copy of XP, but I seriously doubt you can find a new unopened box anywhere.

Yes, you're finally talking about different ideas of support. Right. You're catching on to what I actually want to argue here. Good. Thank you.

Blizzard didn't devote a ton of time to getting StarCraft to work in XP (perfectly, I might add). They didn't spend a lot of time on the small map updates that pleased StarCraft's large "professional" community (mostly in Korea). Their most recent patch to the game simply removed StarCraft's need for a CD key and I bet that didn't take them long either. Even just a few years ago they updated StarCraft with some interface improvements and I'm willing to bet that wasn't too much work for them. However, those improvements helped them sell more copies of the game, helped them keep the "StarCraft" brand popular (in a competitive RTS environment), and generally kept their new and long-term customers happy. Those kinds of support have helped keep Blizzard the successful company it is today. They did little things to keep their customers happy.

In terms of buying StarCraft, Blizzard isn't selling it in stores anymore because they can't make money that way (people aren't buying it in stores enough for them to make a profit by selling physical copies). That doesn't bug me at all. That's the free market deciding that it doesn't want to buy new physical copies of the game anymore. However, if I and plenty of others still wanted to purchase new copies of the game in that way (for the appropriate amount of money), I'm sure Blizzard would still sell it like that (since they'd make money doing that), and if they didn't, that would make me as a fan and a customer displeased.

New copies of XP are being sold for a lot of money right now (nobody is doing that with copies of StarCraft). The reason is that it is still in high demand by people like me who want to purchase it. However, Microsoft has a high degree of control when it comes to forcing a free market to go against what it actually wants (for whatever reason) and, as a result, they don't really worry about the revenue lost by no longer selling new copies of XP or by leaving customers unhappy. A company like Blizzard, on the other hand, isn't allowed to make that kind of a move if it wants to remain successful. A company like Blizzard has to work harder and be smarter than that because RTS games are a far less centralized product (and therefore the market is less controlled). In other words, Blizzard has to work harder to give people more and not give them less. Are you saying Microsoft's market behavior is more ideal?

New copies of XP are being sold at a higher price because that's what Microsoft does.  That isn't exclusive to XP either.  Every time Microsoft releases a new product, the cost of the old product goes up and the new product gets the previous or lower pricing.  That's just the way it is.  That's their way of encouraging people to adopt the new product.  They obviously only want to support the old stuff as long as necessary.  This is also why it gets difficult to find copies of the old stuff after a certain time.

Even Adobe just discontinued their CS3 line.  I had to buy Illustrator CS4 for work because I could not get CS3 from my supplier.

XP was a fantastic operating system in my mind (from its beginning) and Vista has been a much more troublesome experience for me by comparison. Maybe that's not what most people have encountered and therefore my Vista experiences have just been a run of bad luck, but I would have a lot of trouble believing that. That's not a crime on my part and I think the impatience and belligerence you expressed in your earlier posts were uncalled for.

You are correct.  When Vista was first released, it caused a lot of trouble for a lot of people.  But the same thing happened with XP.  At one of my first jobs, when the CFO tried to upgrade to XP, suddenly his printer and scanner would no longer work.  They worked fine under Windows 2000, but there was simply no driver available for XP.  Maybe you had a great experience with XP from the get-go, but plenty of people had the same trouble with XP that they had with Vista.

I'm not some anti-Microsoft kook and I am not an idiot. I don't believe I treated you that way in this thread. How would you like it if I had said stuff like this to you:

"You're probably one of those idiots that has always sucked Microsoft's cock in that you've absolutely loved every product and move they've ever made as a company."

First, let me apologize for my previous comments.  You're right.  I shouldn't have said those things.  You're probably the only person I've ever talked to that didn't have trouble with XP on release, which is why I've been saying these things.

Second, I'd tell you that just 10 short years ago, I was a total anti-MS kook.  Everything MS said and did I looked at as just evil.  It was only in the last 6 or so years that I've started to actually like their products.  Windows 9x (including ME) was a piece of shit.  MS deserved all the blame they got for everything back then.

Of course, now I look at some of my OS X zealot friends and wonder what the hell they're thinking.  They always say "It just works".  To them I say "So does mine.  When you pay for quality, you get quality.  The difference is that I didn't need to spend twice the price of a PC to get quality, I just bought quality parts".

Anyway, that last part was more of a tongue in cheek response to the hypothetical question, so feel free to ignore it :)

lordjedi said:

Seriously, I don't care if you have a love affair with Vista. So, why, then, do you feel so keen on lecturing others for not liking Vista? What on earth is making your blood boil so much with this issue? (I have no fucking desire to have a heated debate about Windows for crying out loud.)

I haven't lectured anyone for not liking Vista.  I've asked people what programs they had trouble with.  I've explained that MS hasn't changed their support timeline or the retail availability timeline (aside from extending them) since they released XP.  What I don't like is people bitching about how MS is forcing them to get a newer version.  What I don't like is people saying how easy it would be to make newer graphics effects work in XP, when they clearly have no idea what's involved behind the scenes.

Okay, well, first, I am not being unappreciative of the remaining support that Microsoft still gives to Windows XP. I am not being unappreciative of the fact that each new OS has a realistic lifetime in our free market. I am not unappreciative of all the hard work Microsoft does when it creates a new OS like Vista that is admittedly better in some ways. I certainly don't have a problem with Microsoft trying to make profits with all of the good things they do. I have not argued otherwise and for you to argue with me as if I were is getting really, really old.

Second, to the degree that what I am complaining about is wrong and misinformed on my part, I'm willing to let you (or someone else) correct me. I'll admit that you probably know a hell of a lot more about these issues than I do. However, I will not change my opinions because someone tries to accuse me of being an ingrate.

And, seriously, you have been "lecturing" me for not liking Vista. You've gone out of your way to turn my arguments into absurd straw men before moving on to tell me how supposedly stupid or spoiled I must be for believing them. Nowhere have I made statements as extreme as those portrayals, however. I must conclude, therefore, that you want to attack me for simply having problems with Microsoft and that's weird to me. Do you have some personal stake in Microsoft to explain your sensitivity on this issue?

Nope.  As I said, and you're not going to like this, most of the time when I hear about complaints, it's from people who haven't used it.  Other forums, friends that only use OS X, etc, etc.  Since you have used it, I was hoping you could give some specifics about the situations you've had trouble with.  Telling someone "It runs like crap on this computer" without giving the system specs for that computer doesn't tell us a whole lot.  XP would run like crap on a system from 1995 (yes, I have heard of people trying to do this).  You've given partial specs, but not really the full thing.  At this point though, that's kind of irrelevant.

I simply do not like hearing people say how crappy something is and then later finding out that they were trying it on totally outdated hardware.  Sure, XP runs on it fine because it was released in 2001.  Vista requires a little more oomph.  Since you've obviously seen it run on much more modern hardware, then I'd have to assume there was a driver problem somewhere.  That is usually where the typical problem is.  People tend to not install the latest drivers (or even any drivers) for all their stuff.  They install Vista, it comes up and works (but it's slow), so they complain.

Having problems with UAC is a little strange.  For years people (not necessarily you) bitched about how insecure Windows was.  Now MS has added a layer of security and people bitch about how much of a pain it is.  None of this is actually Microsoft's fault.  For years they tried to convince developers to program their software "the right way".  For years, developers flat out ignored best practices because people could simply be made Local Admins.  Microsoft finally forced the issue with UAC.  Now developers are having to do it the right way or their program won't work or it'll launch a UAC prompt.  So finally, after many years of telling developers the right way to code for Windows, they're doing it in order to avoid the UAC prompt.  This, in my mind, is a Good Thing.
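
For anyone wondering what "the right way" looks like in code, here's a minimal sketch in C (my own illustration, using the token query APIs Microsoft added for UAC in Vista) of how a program can check whether it's actually running elevated, so it can fall back to per-user locations instead of demanding admin rights up front:

#define _WIN32_WINNT 0x0600  /* Vista or later; TOKEN_ELEVATION needs this */
#include <windows.h>
#include <stdio.h>

/* Ask the process token whether UAC has elevated this process. */
static int is_elevated(void)
{
    HANDLE token = NULL;
    TOKEN_ELEVATION elevation;
    DWORD size = sizeof(elevation);
    int result = 0;

    if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token)) {
        if (GetTokenInformation(token, TokenElevation,
                                &elevation, sizeof(elevation), &size)) {
            result = (elevation.TokenIsElevated != 0);
        }
        CloseHandle(token);
    }
    return result;
}

int main(void)
{
    if (is_elevated())
        printf("Elevated: ok to touch Program Files, HKLM, etc.\n");
    else
        printf("Not elevated: write to the user's profile instead.\n");
    return 0;
}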

Are you trying to tell me, lordjedi, that you have absolutely no problems with Microsoft whatsoever or in any way? Are you telling me it's unfair for laypeople like me to have opinions based on my own experiences? I've built and set up many computers and worked with a lot of different hardware and different versions of Windows for many years and I don't think I'm an idiot. I could be honestly wrong in many ways (I'm no coding expert), and certainly idiotic in some small ways here, but you went way overboard in your previous posts. I really have trouble believing that you think Microsoft is so perfect and so justified in everything that it has done in the free market that you have absolutely no problems with them whatsoever. If that's not the case, however, and you actually do think Microsoft could be better in some ways yourself, then why are you being so extreme in reprimanding someone like me who happens to have some problems of my own (that are based upon my own point of view)?

Actually, I don't really have any problems with Microsoft, no.  The times when my own computer has had problems, it hasn't been related to Windows.  The one time when I nearly lost a lot of data, it was due to changing motherboards and then not reinstalling Windows with the different drivers.  I usually have more trouble with people (not necessarily you) that want to throw everything and the kitchen sink onto a single server and then bitch because it's slow or something else crashes from time to time.  I'm only just now being allowed to offload certain services to other servers because I and other consultants can tell my boss "the server is overloaded".

I have seen problems with Vista and XP.  But those problems are not because of Vista or XP.  Those problems are almost always (99%) driver related.  I've seen hardware fail and people immediately blame Microsoft and Windows.  How it was their fault that a hard drive died is beyond me.  I too have built and used a lot of different hardware.  I see plenty of hardware every day.  I get asked by coworkers about systems all the time.  I know what cheap hardware is, which is why if someone's having a problem, I ask them for their specs and I ask who made the hardware.  Invariably, the hardware is just cheap crap.  I tell them that if they'd just spend an extra $100 (total), that they wouldn't be having all the problems they're having.

The times when I've had lots of trouble with a computer, it was in fact the hardware.  I have no doubt that your experience may be different.  For me, every single time I've had trouble, it was not a problem with the software.

As for their business practices, I'm quite happy that they actually lay out a full support and availability timeline from the moment a product ships.  That lets me plan for upgrades and purchases.  The only thing I haven't been happy about is "Software Assurance", but that's only because they didn't release timely updates when they first introduced it.  If I did upgrades on a regular basis, Software Assurance would save me tons of money at work.  Since we only upgrade "when we need it", Software Assurance actually costs me more, so we don't do it.  We had one recent upgrade that cost us a ton because we didn't keep up with the software, but that wasn't from Microsoft, so I can't really get mad at them.

I was more pissed at Dell when Vista was made available because Dell immediately cut XP from their product line with no order from Microsoft.  It took a couple of months, but they did start reintroducing XP into their lineup and it wasn't until later that they had to get ready to cut XP because MS announced the end of availability (which was actually extended at least once).

I think I actually get more angry at cell phone companies.  They seem to have a 6 month product cycle.  So the phone you buy today probably won't be available a year from now.  Talk about an industry that's constantly pushing the "latest and greatest".

 

If we could discuss the issues I want to discuss, that would make me happiest here. For now, though, I have run out of time and must be moving on. You actually touched on some more substantive points further down in your post and I'll want to discuss them later. For now, though, if you want to reply to the useful parts of our discussion, feel free and I'll try to address them later as well.

I will do my best to not include comments like "Most people say that but have no experience with it".

Post #337771
Topic: Abrams is Destroying Star Trek like Lucas has Destroyed Star Wars
doubleofive said:

If this movie is successful, I honestly believe people will give old Trek a chance again, not dilute the franchise or anything so silly.

Ha!  This just floors me.  If this movie is successful, why on earth would they even go back to old Trek?  They'll take one look at it and thumb their noses, just like they do right now.

But some of us are excited to be able to talk with their friends about Star Trek without looking like anti-social outcasts.  Star Trek either goes mainstream, or it withers and dies like it has been for years.  I'd rather it be mainstream.

If people treat you like an outcast because of the movies you like, then they're not worth talking to.  Seriously, depending on what you're discussing, there's no reason for that kind of shit.  Seeing the latest Trek flick with cool explosions and hot chicks in skimpy outfits isn't my idea of seeing a Trek film.

So yes, I'd rather see Trek die than see it go mainstream in order to capture an audience that isn't going to give a shit about it a year or 5 years from now.  I'd rather see it die than see it become the "blockbuster of the week".  Like C3PX said, it's about dying with dignity.

I'm sure it'll be successful though, based solely on the guy playing Kirk.  He'll bring in all the tween and teenage girls, just like the guy in Twilight.  And the young sexy chicks will bring in all the jocks.  So it'll definitely go mainstream, but conversations will be limited to "OMG! He's so hot!" instead of "I wonder how this is going to play out in the next movie?"

 

Post #337770
Topic: Lord of the Rings on Blu Ray
zombie84 said:

Hmm, yet they still do it. I wonder why that is? Let me guess, they are all old luddites afraid of change who cling on to what they are familiar with. Yeah, sure.

I know plenty of photographers.  The ones that still use traditional film do it because even though it costs them more, they view photography as an art form.  They actually don't like the ability to take 100s of pictures on a single shoot, find that killer shot, and then discard the rest.  They like to take their time and wait for that one killer shot.  They pay for it too.  They have to conserve their film and they have to pay all the costs for processing and storage (they don't throw bad shots away).  Most of them don't do it for anything more than a full time hobby any more since they can't compete with the photographers who are using digital and don't have the same costs they do.  This is what I've heard from photographers and read in photography magazines.

But this is the fundamental part of the issue you're apparently not quite understanding--film LOOKS different.

But that's not at all what I said.  What I said was that they like the way film feels.  What I meant was that they do in fact like going through all this trouble because they feel more connected to the images they're taking.  Unlike the person who'll just shoot and shoot and shoot hundreds of photos and not worry about running out of space on the memory card, the person shooting on film has to stop and think about exactly what they're doing.  They have to consider how much film they have left, is this really a good shot, should I wait and maybe get a better one.  That's not to say the digital photographers are just shooting willy nilly hoping to land a shot, but the film photographer has to take a lot more into consideration.

I've seen this myself when I've been out with my wife and son.  I'll take a bunch of photos, say 5 to 10, of them in one spot.  I get home and it turns out that maybe 2 of them are good shots.  If I was still using film, I'd probably take one or two pictures and just deal with the results (which would hopefully be decent).  In a given day, using film, I'd say I end up with maybe 5 good shots.  Using digital, I usually end up with 20 or so.  If I were a professional photographer, having an increase like that would be a godsend.

That is why it's referred to as a dying art form.  The majority of photographers are not willing to spend that much time thinking about what they're shooting.  They just want the shot.  The quicker they can get it, the better.

HD was made for news. AOTC was shot using a news camera that had a cine lens frankensteined onto it.

Pardon me for saying this, but AOTC is a horrible example to use.  Hell, any Star Wars prequel is a horrible example to use.  I 100% agree that those movies should not have been shot with "HD cams" simply because they'll never look any better than they do.  They are maxed out right now at 1080p.  If there's something better 10 or 20 years from now, they'll never look any better.  At least with 35mm, they can take the raw 4k scan and give us a 2k HD video (if something like that comes along).  That'll never be available for the prequels.

Again--resolution is not what I'm talking about. Resolution is the least of the issues. AOTC looks like shit not because it's 1080p, but because of the way the digital sensor captures the image. The quality--the characteristics--are ugly. There's no black detail, shadows break up and even show digital artifacts, there's really high depth of field, the edges are really sharp and harsh, everything's way too crisp (despite the low resolution), colors bleed, there's not a very nice palette, there's noise galore, especially in dark scenes, and everything simply looks mushy and gross. This has nothing to do with resolution. And many of these issues continue to this day. THAT'S why most photographers refuse to go digital, THAT'S why, when you are spending millions of dollars on a production, you shoot on a chemical emulsion.

But doesn't some of that have more to do with the color correction and other things that were done after it was shot?  We know the 04 DVDs had all the whites turned to blue, so who's to say that the prequels didn't have reds and blues totally blown out at certain points, making it look like shit.

Obviously the cameras weren't ready for primetime either.  I think everybody, including Lucas, knew that.  Lucas, being the hard head he is, just didn't want to admit it.  After all, he was going to usher in a new era of filmmaking, just like he did with the OT.

Post #337695
Topic: Lord of the Rings on Blu Ray
zombie84 said:

But this notion that grain is inherently bad is not only wrong, it completely misunderstands the argument in the first place.

And I'm not saying it's bad.  It's a part of the medium.  I get it.  I totally do.  What I was asking, and you answered, was whether, if you could have the image without the grain, you would take it.  Apparently some DPs wouldn't.  There's nothing wrong with that.  But you also stated that DPs can choose film stock that has a very fine grain.  So by extension, if they needed as little grain in the image as possible, would they not choose a medium that provided no grain if it was available?  Since they can have different levels of grain in the same movie, I can imagine that if they wanted something with nearly no grain, they would take it.  And if something was available that gave them the same image without the grain and they didn't want any grain, why wouldn't they take it?

Photographers use chemical emulsions because that was the only way to do it for 100 years.

Hmm, yet they still do it. I wonder why that is? Let me guess, they are all old luddites afraid of change who cling on to what they are familiar with. Yeah, sure.

I know plenty of photographers.  The ones that still use traditional film do it because even though it costs them more, they view photography as an art form.  They actually don't like the ability to take 100s of pictures on a single shoot, find that killer shot, and then discard the rest.  They like to take their time and wait for that one killer shot.  They pay for it too.  They have to conserve their film and they have to pay all the costs for processing and storage (they don't throw bad shots away).  Most of them don't do it for anything more than a full time hobby any more since they can't compete with the photographers who are using digital and don't have the same costs they do.  This is what I've heard from photographers and read in photography magazines.

Digital photography is nearly indistinguishable from film, so many photographers have switched to digital.

WRONG!! Maybe to the layman it is.

No, this comes from professional photographers that I know.  As of about 5 megapixels, to the naked eye, they could not tell the difference between an 8x10 print from 35mm and an 8x10 print from digital.  They had to use a loupe to see the differences.  That was 6 or so years ago when I confronted one about his love for digital photography.  At the time, opinion among him and the other photographers he knew was slightly split between film and digital.  Film was preferred by some simply because they felt there was more detail, even though it was imperceptible.  He had gone straight digital since he had seen side-by-side comparisons of digital shots and 35mm.  They were indistinguishable to him and many others that looked at them.

 

If a piece of art gets people talking and looks really good, why does it matter if it was done in Photoshop or with brush strokes?

Because if you see the real Starry Night by Van Gogh you'd know that what you get in photoshop doesn't even begin to approximate that. It's quite laughable. But hell, using Van Gogh is a pretty bold example, go down to an art college and check out any random painting. The photoshop plug-ins weren't meant to replace brush and pigment. They are there for graphic design, for a cheap, quick, disposable way of getting "the idea" across that it's supposed to be brush and pigment. For photoshop, yeah, it's pretty good. For photoshop.

So I'm supposed to judge how good art is by the brush strokes in the painting and not by how it looks?  That sounds kind of lame and elitist to me.

HD was made for news. AOTC was shot using a news camera that had a cine lens frankensteined onto it.

Pardon me for saying this, but AOTC is a horrible example to use.  Hell, any Star Wars prequel is a horrible example to use.  I 100% agree that those movies should not have been shot with "HD cams" simply because they'll never look any better than they do.  They are maxed out right now at 1080p.  If there's something better 10 or 20 years from now, they'll never look any better.  At least with 35mm, they can take the raw 4k scan and give us a 2k HD video (if something like that comes along).  That'll never be available for the prequels.

I know all this because I work in the biz. I'm part of the International Cinematographer's Guild and have seen the key years from 2005-2008 when people outside of George Lucas and Robert Rodriguez actually started adopting HD.

One day, HD will solve MOST of its problems. Maybe by 2020. Maybe not. People have already been brainwashed that "HD=good" so it'll happen unless people start smartening up. You see it in the names of products--they'll throw "HD" into the product label to subconsciously make you think it's better, even when it has nothing to do with digital video display (hell, everything from printers to blenders to exercise equipment). And they've already been partially successful in brainwashing people that the inherent characteristics of HD--clear images with sharp edges--are good; "look at how CLEAN this is! It's not dirty! It's really SHARP! You can SEE everything really clear!" A lot of this has to do with the parallel explosion in high-res video games. So I guess the public consciousness of the layperson is shifting. They want movies to look like their X-Box and they want the expensive plasma TVs that they paid out of their ass for to display like you are looking into a mirror. The film industry is driven by different pressures than other fine art mediums. But most cinematographers have been doing their best to resist this, to slow down the process and, hopefully, drive high-def video into a more acceptable aesthetic before it's ready for wide professional adoption. For the average guy, the ease and versatility of video is a welcome trade-off, but there's a reason why, when you are spending millions and millions of dollars to produce a film, photographers are deliberately going with chemical emulsions.

I agree with probably all of this.  It's even happened with radio.  "Listen to us on HD radio 101.1!".  The HD doesn't even stand for hi-def, but people think it does.

IMHO, HD for hi-def is good.  The more resolution you can get on a screen, the better.  If I end up being able to see grain, I'm ok with that.  As long as what's there is what's supposed to be there, I have no problem with it.  Unfortunately, I'm probably in the minority and this isn't in the same league as the "widescreen problem".  You're right that people want a crisp and clear image.  Knowing that, it's going to be nearly impossible to convince people that the grain is supposed to be there without going through everything you've mentioned here.  People are going to see the "flaws" and get angry.

 

Post #337669
Topic: Abrams is Destroying Star Trek like Lucas has Destroyed Star Wars
skyjedi2005 said:

Even if the TOS fans hate this movie they pretty much have to embrace it.  Bring their friends and family to see it multiple times.  It is the only way new Trek will happen.  If this thing bombs, Paramount would be well within its rights to never spend a dime on the series again.

Then so be it.  If this turns out to be a terrible Trek, I'd rather see it bomb than see it be successful and watch Trek turn into yet another teeny bopper flick.  Trek is better than that.  Trek's success should not hinge on whether the "tween" crowd wants to see this or not.  It should hinge on it being a good story or not.

Johnny Ringo said:

skyjedi2005

 

 

A crowd that doesn't normally give a crap about Trek anyway.  So they're going to insert lots of action and some sex into a Trek film.  It'll be yet another blockbuster of the week that everyone will forget about a month after it's released.

 

Post #337606
Topic: Lord of the Rings on Blu Ray
Gaffer Tape said:
lordjedi said:

I agree with this 100%.  I still have trouble convincing people that the "black bars" are supposed to be there.  And now the same thing is happening with grain.  If it's supposed to be there, then I have no problem with it.  Hell, I used to be one of those uneducated consumers that didn't know about widescreen.  Once I did find out though, that's all I wanted.  Once I learned the difference between "widescreen" and anamorphic widescreen, the black bars didn't bother me at all (when I first saw them on my widescreen TV, I was pissed).

Just to make sure we're on the same page, I'm not as concerned about widescreen itself as I am about original aspect ratios.  It's taken nearly two decades of home video (and decades before that of television broadcasts) of chopping up movie frames to holy hell before people finally started to get educated.  The only problem is that, now that widescreen TVs are becoming the norm, the opposite problem is happening with the uninformed consumer:  television shows and movies are being cropped to fit this new wide television screen without pillarboxing.  They did it on those crappy DBZ season box sets, and they've done it on a few Disney movies.  It almost seems like we got across the wrong message.  Widescreen's suddenly the new "thing," so everybody wants it wide, regardless of how it's supposed to be.  Well, that, and the same people who complained about horizontal black bars and never learned any better are now complaining about vertical black bars and still probably won't know any better.  It's cringeworthy when people stretch out a 4:3 image to fit a 16:9 screen.  I saw my roommate go above and beyond that.  He was watching a dual-sided DVD.  One side was 4:3, and the other was widescreen.  He was watching the pan and scan version stretched out to widescreen.  I think a few synapses in my brain blew out when I realized the total lack of logic in that.

Haha.  That is pretty strange.  If given the choice, I'll always take the wide aspect version.  Unfortunately, I can't get all my channels in a wide format, so I'm forced to watch 4:3 ratio programming stretched to 16:9.  It's either that or I end up with burn-in on the sides (my tv had burn-in marks that I could see, so I stopped watching it like that).  But otherwise I agree that watching a 4:3 program stretched out is cringeworthy.

I haven't seen any TV shows that don't use pillarboxing where appropriate.  Most of the local HD news channels use it, they just don't use black (usually some kind of swirly blue color).  As for other tv shows, I'm assuming they're filming them in wide aspect now.  I know the last few seasons of Buffy the Vampire Slayer were filmed in widescreen, so they match the aspect of a wide tv.  I'm assuming Smallville is filmed the same way now since they also match the aspect of a wide tv (at least the HD channel does).

 

Post #337576
Topic: STAR WARS: EP IV 2004 REVISITED ADYWAN *1080p HD VERSION NOW IN PRODUCTION*
doubleofive said:
jlw515 said:

Ok I still cannot find the link anymore on this site http://fanedit.org/517/ to download the torrents. Is it still there and I am just missing it, or did someone take it down? I'd like to have DVD-9 with the deleted scenes, but I only have 10 of the 80 files downloaded. Any help here would be hot.

Jason

The MPAA took it down.  The whole site.  You're on your own on finding the rest of the rapidshare links.  Someone might have them handy, but there is no easy way anymore.

FE.org isn't down.  The links have all been removed.  Try contacting Adywan directly via email or PM.  He might be able to hook you up.

 

Post #337575
Topic: Lord of the Rings on Blu Ray
Gaffer Tape said:

I doubt I have anything new to add at the moment, so I guess I'll just lay my own opinions on the line.  For me, film preservation is a no-brainer.  Yes, keep the grain!  No, don't colorize!  No, don't try to force something made decades ago to fit current, popular aesthetics.  Don't alter aspect ratios to fit the sizes of TVs, whether that ratio is the 4:3 of SD TVs or the 16:9 of new TVs.  It's all ridiculous and is going to continually be a bone of contention between people who know better and the "average consumer."  And, for some reason, the average consumer is never going to be convinced to be educated.

I agree with this 100%.  I still have trouble convincing people that the "black bars" are supposed to be there.  And now the same thing is happening with grain.  If it's supposed to be there, then I have no problem with it.  Hell, I used to be one of those uneducated consumers that didn't know about widescreen.  Once I did find out though, that's all I wanted.  Once I learned the difference between "widescreen" and anamorphic widescreen, the black bars didn't bother me at all (when I first saw them on my widescreen TV, I was pissed).

It took years to educate the public about how much better widescreen is.  It's going to take years and many a screwed-up release to educate them that the grain they're seeing is supposed to be there and isn't dirt.

As for the succession of technology, I consider that much more of a gray area.  I've only ever done digital editing.  I'd love the chance to try out editing on actual film.  However, part of me wonders, like lordjedi, if the trade-off does allow for comparable results with much less fuss.  It seems, especially with this digital shift in medium, that there is always some sort of trade-off, and it usually seems to be quality for ease.  Some people see it as being worth it.  Others don't.  I'm on the fence.  However, it seems that most people here agree that, ultimately, technology will win out, for better or for worse.  If that is the case, and digital is ever able to provide a comparable image, I suppose it will have to come down to aesthetics, where digital provides a certain-looking image while film provides another style.

I've spliced home videos together.  Personally, I don't ever want to have to do it again.  I was only repairing a broken reel, but it was a pain in the ass.  I've spliced VHS movies together too, same pain in the ass.  I would much rather do digital editing than have to painstakingly edit together reels of film into a usable product.  That's mostly because I can see the actual results as I'm working and it's much quicker to zoom in on a frame of video and find the scene change.  I can't imagine splicing film together though, that would be a nightmare (to me) for anything more than a simple repair.

Post #337561
Topic: Lord of the Rings on Blu Ray
zombie84 said:
lordjedi said:
C3PX said:

If the use of film were to come to an end, it would be the loss of an art form.

I kind of see it the opposite way.  Digital will get cheaper and cheaper while film will just get more expensive.  So fewer people might end up using film, but it'll still be there as an art form.

Similarly, with digital photography you no longer have to worry about having enough film to get that "killer shot".  As long as you have enough space on the memory card, you can take pictures to your heart's content.  When you get home to your "digital darkroom" you can then pick out that one great shot out of the hundreds of photos you took.  Maybe you didn't get the ISO right or maybe the exposure time was too long on one shot.  As long as it was right on the one shot out of 20, then you still got your shot.  And of course it's also possible to "make" that one great shot if one photo is close but not quite right.  The difference is that instead of costing you 80% of a roll of film (19 throwaway frames on a 24-exposure roll), the digital shot didn't cost you anything.

Of course, there are still people who would prefer to wait and try to get that perfect shot.  To them I say good luck.  It'll be a little more expensive for them, but if they enjoy it, then so be it.

 

In photoshop there is an acrylic paintbrush option. It looks the same as a real acrylic brush stroke, you can edit the parameters including strength and brush coarseness, and you have much more precise color mixing options. You can erase and re-paint at the stroke of a mouse click and more importantly you don't have to buy acrylic paints, which are expensive, spend time mixing palettes, use easels and canvases which are bulky, space consuming and cost money, and you don't have to buy fancy brushes with specially made hairs. Plus you have digital filters and plug-ins and the ability to have unlimited image manipulation in the digital realm.

But would you want all fine art made in photoshop?

If the emulation in Photoshop is as good as it sounds, I challenge anyone to tell the difference anyway.

With that said, Adobe must have had some kind of demand for a feature like that.  Otherwise, it wouldn't make sense to implement it.  Something tells me that the "purists" didn't want it, but the aspiring artists and others did (this is based on nothing more than the comments here).  If it's indistinguishable from the real thing, why does it matter what tool (that's all Photoshop is, a tool) was used to make it?

Film is the exact same. Anyone who tries to argue differently either doesn't actually understand the art of photography or doesn't care in the first place. To those people, digital is an efficient trade-off in speed and quality, but for people that care about the art it's not a replacement. Audiophiles listen to vinyl records, cinematographers shoot on 35mm film, photographers use chemical emulsions in whatever format and painters use oil or water based pigments on a physical surface. Digital emulation is not a replacement for any of the above, not yet and not ever.

Audiophiles listen to vinyl not because it's an art form, but because it more accurately reproduces the sound (or so they say).  Sure, you can reproduce pops and clicks in a digital file, but audiophiles don't listen to vinyl to hear the pops and clicks (those are a side effect of the media).

Cinematographers shoot in 35mm because it has a much higher resolution than digital (for the foreseeable future).  I'm sure they also choose their film based on how much grain is there too, but somehow I doubt they'd mind if they could get the same thing with no grain.  If they can indeed choose film that has as little grain as possible, then why wouldn't they choose a medium that could reach the same resolution with no grain at all?

Photographers use chemical emulsions because that was the only way to do it for 100 years.  Digital photography is nearly indistinguishable from film, so many photographers have switched to digital.  They save money and they save time and they're getting the same results.

If a piece of art gets people talking and looks really good, why does it matter if it was done in Photoshop or with brush strokes?  And if you can't tell the difference, why does it matter even more?  Isn't the point of art to express yourself?  If Photoshop lets someone do that for 1/10 the price, what's the problem?  Why does the medium matter at all?

You guys are starting to sound like the people that didn't want sound in movies or didn't want to see color when it arrived.  Both of those ushered in changes to the way movies were made and how things were done.  Digital is no different.  Just because there's no grain does not mean that film making becomes any less of an art.  You'll still have plenty of shit movies, the only thing that will change is the medium.

I've yet to read a convincing argument why digital is somehow worse than film other than the available resolution.  A shitty movie is going to be a shitty movie no matter what it's shot on.  But a good movie, a really good movie, is going to be good whether it has grain or not.  Lack of grain isn't somehow going to make an otherwise stellar movie into a piece of shit.

 

Post #337516
Topic: Lord of the Rings on Blu Ray
C3PX said:

If the use of film were to come to an end, it would be the loss of an art form.

I kind of see it the opposite way.  Digital will get cheaper and cheaper while film will just get more expensive.  So fewer people might end up using film, but it'll still be there as an art form.

Similarly, with digital photography you no longer have to worry about having enough film to get that "killer shot".  As long as you have enough space on the memory card, you can take pictures to your heart's content.  When you get home to your "digital darkroom" you can then pick out that one great shot out of the hundreds of photos you took.  Maybe you didn't get the ISO right or maybe the exposure time was too long on one shot.  As long as it was right on the one shot out of 20, then you still got your shot.  And of course it's also possible to "make" that one great shot if one photo is close but not quite right.  The difference is that instead of costing you 80% of a roll of film (19 throwaway frames on a 24-exposure roll), the digital shot didn't cost you anything.

Of course, there are still people who would prefer to wait and try to get that perfect shot.  To them I say good luck.  It'll be a little more expensive for them, but if they enjoy it, then so be it.

 

Post #337480
Topic: Lord of the Rings on Blu Ray
ChainsawAsh said:

If this becomes cost-effective, and editing setups can online this massive resolution of footage, then I, with extreme sadness, predict the death of film by the end of the next decade (that is, 2020).  It will happen the same way film editing switched to digital.  Once it becomes cost-effective and the quality debate is negligible, that's the end.

Is that a bad thing?  I'm not a videophile and I'm certainly no expert, but if the difference in quality becomes negligible, I certainly don't see that as a bad thing.  Obviously there won't be any grain after that, but I don't think I'd even call that a negative for digital.  Since grain is just the way film is, then the death of film would mean the end of grain, but I don't necessarily see that as a bad thing.

So, as a serious question, is the death of 35mm a bad thing and why?

 

Post #337241
Topic: Abrams is Destroying Star Trek like Lucas has Destroyed Star Wars
Gaffer Tape said:
skyjedi2005 said:

They have a lot to prove since these guys wrote the awful script for Transformers. Bayformers, as it is now called by its haters.  That was a good popcorn flick but raped its source material. I hope the same is not done here.

I grew up with and loved Transformers myself, but I gotta call this into question, sky.  I admit I haven't seen the whole of the new movie, but I have seen parts of it.  And I have heard quite a bit of people saying it didn't respect the source material.  But that's never really bothered me too much.  Because, honestly, how much serious respect can you give the original cartoon knowing that it only existed to promote a line of action figures?

Some friends and I just tried watching Transformers on Blu-ray this past weekend.  At first we started it and let it play.  Then we started skipping to the action scenes.  Then we just flat out turned it off.  Every couple of minutes, even during the action scenes, we found ourselves saying how lame it was.  From the Autobots transforming into cars on the guy's lawn to the 10-minute travel time from LA to Vegas, it's just awful.

The first time through the movie was just alright.  It has some good action scenes in the beginning, but even there we had to look past the lame things (like the unauthorized vehicle actually landing at a military base in a war zone).  Each subsequent viewing gets worse.

 

Post #337209
Topic: HowTo: Put Wookies into Return of the Jedi.
TheoOdo said:

Speaking of volunteers, whole new scenes could be easily created with the assistance of the 501st Legion and their rebel equivalent. Costuming Stormtroopers and Rebels wouldn't be a problem, probably wouldn't even cost a penny. Locations would naturally be any forest that's available but preferably one of the Redwood National and State Parks. The only difficulty here would be gaining permission to shoot, but this is, again, possible.

Good luck getting the 501st or Rebel Legion involved.  If it even smells like it's going to make Lucas mad, they won't even discuss it.  You'd have much better luck getting a bunch of random costumers together.  You'd basically be asking them to reshoot entire portions of Jedi so you can make a fan edit that has Wookiees instead of Ewoks.  They don't usually look too kindly on things like that.

If anybody from the 501st or RL did participate, they'd be risking any kind of reputation they've built up with those organizations and LFL.  If something went bad, they'd either be excluded from future events for some time to come or they'd be kicked out entirely.

They may be a bunch of elitists (personal opinion), but they have very good standing with LFL and they aren't going to do anything to jeopardize that.

 

Post #337198
Topic: Abrams is Destroying Star Trek like Lucas has Destroyed Star Wars
Hunter6 said:
Count Dushku said:

I need to make a comment before I log back out for a real long time.

The reason why Star Trek began to suck began with TNG. Some hack job liberal writers got together and began to spew their evangelical message with all the fervor of fundamentalists. Sure, Rick Berman has his share of the blame, but the writers were the ones who ultimately burned the ship. It got to be too preachy.

Rick Berman is not a liberal, nor were the later writers like Brannon Braga.

Both Rick Berman and Brannon Braga are now working on 24, which is a Republican show.

Brannon Braga was the worst thing to happen to Star Trek, and I see Brannon Braga as the real killer of Star Trek, not Rick Berman.

The reason Star Trek began to suck with TNG is that the real liberal viewpoint, which Gene started on the show, died with him.

Gene didn't die until after Season 5 of TNG.  There's a noticeable difference between seasons 1-5 and anything after.  It's quite obvious the changes came about due to Roddenberry's death.  To me, TNG really started to suck after Season 5.  Sure, there were occasional good episodes, but for the most part it was all downhill after Roddenberry's death.

 

Post #337146
Topic: Blu-ray prices not coming down
C3PX said:

Most cell phone companies will give you a phone when you sign their contract, either way you still have to pay the monthly fees, so why not switch to a smaller, sleeker, more practical device when given the chance.

Just so both of you know, cell phones are a horrible example to use.  You get the phone for "free" because the monthly bill is subsidizing the cost of the phone.  That's why there's such a hefty fee to cancel your plan before the contract is up.  You're essentially paying for the phone at that point.

Many companies will give you some kind of discount when upgrading to a new phone, but you likely won't get a "free" phone without switching to a different company altogether.  Until recently, cell phone numbers weren't portable, so switching companies wasn't practical.

That's just my two cents about that example.  Feel free to carry on :)

 

Post #337144
Topic: Abrams is Destroying Star Trek like Lucas has Destroyed Star Wars
Count Dushku said:

I don't give a damn if most of the Trekkers don't like the new Star Trek. Go watch your TNG, you who were in your twenties and thirties when TNG came on. Give those of us who grew up with Star Trek from a young age something cool again.

Grew up with Star Trek?  It ran for a total of 3 seasons.  Unless you were 10 when it aired, you didn't grow up with Star Trek.

I was 14 or 15 when TNG came out.  That was after no Star Trek for many years.  But even I saw and enjoyed TOS.  Many of us don't need flashy effects and huge explosions to enjoy Star Trek.  The dream of traveling to far away places in deep space was enough to get some of us totally hooked.

All Good Things certainly seemed to have a moral message to me.  That message was "be careful what you wish for".  Isn't that the one where Picard had wished that he hadn't gotten involved with that fight?  And by not getting involved with that fight, he ended up in Engineering instead of becoming captain of a starship.  He took the "safe" route, so he was never considered for any positions that would involve taking risks.  Maybe it wasn't a "save the whales" message like Star Trek IV, but it certainly had a message to it.

 

Post
#337135
Topic
Blu-ray prices not coming down
Time
C3PX said:

I thought it was cool, but I didn't feel it was worth the money, especially since I already owned a good sized library of VHS tapes.

Know what had me sold?  This was in 1999.  On one of my first DVDs (it might have been my first one), the first thing I did was pause the movie during a scene.  I was treated to a perfectly still, crystal clear image.  Contrast that with the standard VHS "shaky" pause.  I was instantly sold.  After that came scene selection, the extra features, and the fact that every movie was available in widescreen (only later did I find out that anamorphic was a different animal).

All of those things had me hooked on DVD from the moment I hit play.  I actually didn't invest heavily in VHS because movies were rarely available in widescreen.

 

Post
#337057
Topic
Windows 7
Time

Tiptup said:
lordjedi said:

Your first comment is totally inaccurate.  Read what I wrote again.  DX9 gives access to the same effects; the difference is that DX10 makes those effects easier to do.

I've been researching parts for a new computer the last few months and one of the reasons I was planning on getting Vista was because I've read that DirectX 10 would be supporting newer, hardware-based effects that XP won't have access to without updates to DirectX 9. Here's one of the effects that I was led to believe this about:

http://en.wikipedia.org/wiki/Geometry_shader

Now, maybe you're just a lot smarter than all of the news and information sources I've been looking at, and if that's the case then I'm obviously misinformed, but last I checked you won't be getting support for that effect in XP. If all of the sources I looked at are correct, then considering how Microsoft was actually still selling XP when they released their support for "geometry" shading, I don't see why XP couldn't have been given support for that effect too.

Sorry, I guess I simplified it too much.  I didn't mean that DX10 wouldn't have any new effects that were unavailable in DX9; what I meant is that some of the effects in DX9 are just like the ones in DX10, but DX10 makes them easier to code.  So of course DX10 will have newer effects, just like DX9 had newer effects than DX8, and DX8 newer than DX7, and so on.

Just because they were still selling XP does not mean they were doing new development for it (aside from patches and security updates).  Even XP SP3 just wrapped all the updates since SP2 and added a few networking enhancements from Vista into one package.  The point is that DX10 is built on Vista's new display driver model (WDDM), which accesses display drivers in a completely different way that's incompatible with XP's older driver model.  That is why DX10 is not available for XP.
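
Here's a rough sketch of what that difference looks like from the programmer's side.  This is my own illustration, not anything out of MS's whitepapers; assume you already have a window handle hwnd, and I'm leaving out all error checking.  D3D9 creates its device directly against the runtime sitting on XP's old driver model, while D3D10 routes everything through DXGI, a layer that only exists on Vista's rewritten driver stack:

#include <windows.h>
#include <d3d9.h>
#include <d3d10.h>

void CreateDevices(HWND hwnd)
{
    // D3D9 (XP era): create the API object, then ask it for a device.
    IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    IDirect3DDevice9* dev9 = NULL;
    d3d9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                       D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev9);

    // D3D10 (Vista only): the device and swap chain are created through
    // DXGI, the new infrastructure layer on Vista's driver model.
    DXGI_SWAP_CHAIN_DESC sd = {};
    sd.BufferCount       = 1;
    sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.OutputWindow      = hwnd;
    sd.SampleDesc.Count  = 1;
    sd.Windowed          = TRUE;
    IDXGISwapChain* swap  = NULL;
    ID3D10Device*   dev10 = NULL;
    D3D10CreateDeviceAndSwapChain(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
                                  D3D10_SDK_VERSION, &sd, &swap, &dev10);
}

Same app-level concepts, but the DXGI plumbing underneath the second half is exactly the part that doesn't exist on XP, which is why this isn't a quick patch.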

Unless you're a gamer, DX10 isn't going to mean a whole lot to you.  Yes, it's faster and better than DX9, but I don't see missing out on it being a detriment unless you're using your computer to play games.

lordjedi said:

 

You may think backporting DX10 to XP isn't that hard, but you also don't know the code.  I've seen whitepapers from MS that show the difference between the driver models in XP and Vista.



Why do you keep talking about me as if I want DirectX 10 put into XP? I've already said that if Microsoft did a lot of work on a newer version of DirectX that I'm fine with them keeping it native to Vista only. This line of yours is really starting to bug me. I hate it when people don't read what I'm saying.

I just want the same support for the newer hardware effects since I think XP is a superior OS. If that desire of mine is misinformed (and XP already supports every possible effect that Vista will support), and you can prove that, then I am corrected and we can move on. Until then, I'm going to think this is an easy way Microsoft could continue supporting XP (and should, if they want happy customers).

OK.  Then I'll put it this way: MS did put a lot of work into DX10.  Just because it doesn't look that way to you, and you don't know what's going on behind the scenes, doesn't mean they didn't put a lot of effort into it.

You're free to think XP is a superior OS.  You're wrong of course, but you're free to have that opinion.  Vista has improved support for multi-core CPUs, much better memory handling, and much better support for games.  And that's just the beginning of the improvements.  I've seen XP and Vista on the same modern hardware (Core 2 Duo, 2 GB RAM, built-in video) and Vista was noticeably faster.  That was even before SP1 for Vista came out so I have no doubt that Vista is even faster with SP1.  On the same hardware, I was able to leave all of Vista's flashy effects turned on and not feel like the system was crawling.  When I do the same on XP, I always want to turn the effects off because I feel like the system is slowing way down.


 

lordjedi said:

You expect more support?  Hey genius, try going to Apple and getting support on OS 9.  I bet they don't do it.  Getting support from MS for XP is the same thing.  It's an outdated OS that has run its course.



I don't "expect" more support for XP; I
want more support for XP. If Microsoft wants me to be a happy customer (which is up to me to decide in a truly free market), then it would be wise for them to give Windows XP a little more support.

Microsoft's "support" for XP hasn't ended.  The only thing that ended is retail availability:

http://www.microsoft.com/windows/lifecycle/default.mspx


And, hey, genius, in your opinion XP is an outdated OS that has run its course, but I'm a different person and my opinion can be different. The best way to deal with different opinions, from people like me, is to discuss them rationally and not say the same thing over and over.

That's not my opinion though.  That's Microsoft's stated fact.  XP is outdated and has run its course.  Here's their support timeline:

http://support.microsoft.com/lifecycle/?p1=3223

As you can see, support is still available for XP.  Just because you can't buy XP from a store (retail availability) or from MS doesn't mean it isn't supported anymore.

Seriously, lj, do you think game manufacturers are stupid chumps for supporting their games with patches many years after they come out? Is Blizzard a stupid company for still upgrading StarCraft after practically ten years? The kinds of small support I'm asking for aren't extreme. The free market has space for many different ideas of software support (assuming the free market is functioning) and for you to demonize me for wanting more support is getting really silly. (You're giving me a headache.)

You're right, the free market does have many ideas of software support.  Which is why there are companies out there still supporting NT, even though Microsoft no longer offers support for it.

To answer your question: without knowing what that patch fixed, I couldn't say if they're stupid or not.  If that patch only took a few man-hours to work on, then no, they probably aren't stupid.  But if it took several weeks to do, then yes, I'd wonder why they worked on it.  With the success of WoW, I wouldn't understand them putting any extended effort into their legacy products.

The funny thing is, you can't buy StarCraft from them except as a digital download.  So why don't you go try to buy a 10-year-old game in a box and see how easy it is?  I'm sure you could get a used copy, just like you can get a used copy of XP, but I seriously doubt you can find a new unopened box anywhere.

lordjedi said:
A few years ago, you were probably bitching that Vista still wasn't out and that XP was getting old.

 

No, actually, I wasn't, you obnoxious fruitcake. Where is there any evidence for you to go off assuming something like that about me? XP is perhaps the best version of Windows I have ever used and I was one of the people who bought it on the day it was released. (XP has always been a fantastic product to me.) The few times I've used Vista I've found it to be a piece of shit by comparison. Beyond the fact that it is clearly less stable (I had no major errors or restarts with XP from the very start), having to tell it that I want to wipe my ass all the time (or be bothered by a security message every two seconds) is absurd. I just don't like it at all and I don't see who you are to fucking reprimand me for making that personal judgment. (Only lordjedi's personal judgments of what's desirable or undesirable are allowed in this world?)

 

The comment was made because most of the people that bitch about Vista were the same people bitching about XP when it was first released and now they're professing how great XP is in comparison.  I remember the comments quite well.  XP was trash and 2000 was the best OS ever released.

Seriously, I don't care if you have a love affair with Vista. So, why, then, do you feel so keen on lecturing others for not liking Vista? What on earth is making your blood boil so much with this issue? (I have no fucking desire to have a heated debate about Windows for crying out loud.)

I haven't lectured anyone for not liking Vista.  I've asked people what programs they had trouble with.  I've explained that MS hasn't changed their support timeline or the retail availability timeline (aside from extending them) since they released XP.  What I don't like is people bitching about how MS is forcing them to get a newer version.  What I don't like is people saying how easy it would be to make newer graphics effects work in XP, when they clearly have no idea what's involved behind the scenes.

lordjedi said:

 

I'm not trying to convince any of you to upgrade.  That's your choice if you want to or not.  But don't try to say that MS is forcing you to upgrade.  You don't have to do it.  Go use Linux or some other alternative.  No one's making you upgrade anything.

Huh? You've just made three long posts about how horrible a person I am for wanting XP to have some more support and for thinking Vista isn't absolutely worth the money. That certainly sounds to me like you're trying to tell us all what's a good or bad decision.

Nice try.  I'm not saying you're a horrible person at all.  What I am saying is that I think it's unreasonable to expect MS to support such an old OS for so long.  No other company takes the beatings MS does when they announce the end of retail sales and the coming end of support.

Also, nowhere am I saying that Microsoft is really forcing me to "upgrade" in any absolute sense (that's absurd and you're clearly not reading what I'm saying here). My criticisms with Microsoft (and supposedly "upgrading" their products) are far smaller than that.

I have no problem with old software dying when its time comes in the free market. It's all the little things that Microsoft does to influence that transition that bother me as a customer. The way that I can no longer purchase a new, decently priced copy of XP or some equivalent OS is another good example. Linux and the other OSs on the market are not an alternative to XP. If there were truly an operating system being sold on the market that works and functions just like XP, then I can assure you that I would be purchasing it. However, there is no operating system on the market like it and I see no good reason why. Why is there no OS on the market that can perform comparably to Windows XP (running all the same software in the same way and so on)? Wouldn't another company make money by selling a Windows equivalent?

Uh, OS X running Parallels?  OS X has comparable software for everything except gaming.  MS Office is available.  I'm sure there's an accounting package available.  I don't know about any CAD packages.  Most Adobe software is available.  For the occasional package that isn't available, simply fire up Parallels to run the software under XP, or dual boot with Boot Camp and XP.  There are alternatives.  I'm not saying I like them (I don't), but alternatives are there.

Microsoft has to make a business decision to either keep supporting old products in perpetuity or move on to newer technologies.  XP and Vista are no different than any other product in the past.  And since Microsoft has shareholders to answer to, they have to do what's best for their business, not necessarily what people think is best for the marketplace.

Continuing to support ancient products (remember, XP's support hasn't ended, just retail sales) can actually be detrimental to a company.  Just take a look at Novell.  They supported NetWare 3.12 for what seemed like forever.  In fact, they supported it for so long that people didn't bother upgrading.  Why upgrade when it works and you can still get support?  Now look at them.  They're a shadow of their former selves.  If they had not waited so long to end support for an ancient product, they might have been able to get people to upgrade and at least been able to compete.  Instead, they ended up basically dropping NetWare and becoming a Linux company.

With Windows 7 on the horizon, Microsoft is continuing to move forward.  If you want to stay with XP, feel free, no one's forcing an upgrade.  But if you want to take advantage of the newer features and enhancements in Vista and Windows 7 (touch screen capabilities, mmm), then you'll need to shell out some cash and upgrade.

I'm honestly still on XP, but that's because I'm lazy and don't feel like going through the trouble of making a complete backup of all my stuff, formatting my system partition, and installing Vista.  I like Vista.  I like it a lot.  But right now, it's a little bit too much trouble to bother installing.  If something ends up requiring it, then sure, I'll install it no problem.  Just like I upgraded to Win98 way back in the day in order to have better USB support because I was going to work on a USB project.  Just like I upgraded to XP because Adobe Premiere wouldn't work on Win2k and XP had better support for FireWire.

EDIT: If you want XP so bad, go buy it: http://www.google.com/products?q=windows+xp+professional&ie=UTF-8&oe=utf-8&rls=org.mozilla:en-US:official&client=firefox-a&um=1&sa=X&oi=product_result_group&resnum=1&ct=title

Post
#336921
Topic
Windows 7
Time
Tiptup said:
lordjedi said:

Your comment about newer effects not needing to rely on a driver model to work is laughable.

Okay. :)

DirectX 10 gives access to hardware-based effects that the latest version of DirectX 9 won't allow gamers to access. (DirectX 9 is the highest Microsoft will allow XP to go.) For Microsoft to claim that they can't support those effects in DirectX 9 is laughable. For you to claim it wouldn't be possible for them to "easily" support those effects in DirectX 9 is also laughable to me. :)

Edit: Your other points are worthless to me. If you like the amount of money you have to pay to Microsoft to get your new hardware and software to work every three years or so, you can do that and I won't care. I expect a bit more support for a piece of software that costs me over a hundred dollars to purchase.

Your first comment is totally inaccurate.  Read what I wrote again.  DX9 gives access to the same effects; the difference is that DX10 makes those effects easier to do.  DX10 does have some more advanced effects, but games like Bioshock, Crysis, and Company of Heroes look the same in both DX9 and DX10.  Anyone that has taken screenshots has seen little to no difference.  There are a couple of games coming that will be DX10 only, but it remains to be seen how well they do.

MS is not claiming they can't support those effects.  As I've already said, DX9 supports those effects.  You may think backporting DX10 to XP isn't that hard, but you also don't know the code.  I've seen whitepapers from MS that show the difference between the driver models in XP and Vista.  Again, they would have to completely rewrite the driver model in XP in order for it to work.  It wouldn't be a little patch; it would be a major change, probably involving a service pack.  DX10 hooks into the drivers in a completely different way that is faster and more efficient.  XP is 7 years old now.  It's just not worth the effort to do it.
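
To make the driver-model point concrete with your geometry shader example, here's a rough sketch.  Again, this is my own illustration, not MS code, and the blob parameter is just a stand-in for compiled shader bytecode you'd get out of the HLSL compiler.  In DX10 the geometry shader is a first-class pipeline stage on the device; DX9 has nothing to hook it to, because IDirect3DDevice9 only knows about vertex and pixel shaders:

#include <d3d10.h>

void BindGeometryShader(ID3D10Device* dev10, ID3D10Blob* blob)
{
    // D3D10: create a geometry shader from compiled bytecode and bind it.
    ID3D10GeometryShader* gs = NULL;
    dev10->CreateGeometryShader(blob->GetBufferPointer(),
                                blob->GetBufferSize(), &gs);
    dev10->GSSetShader(gs);   // runs between the vertex and pixel stages
}

// D3D9: IDirect3DDevice9 has CreateVertexShader() and CreatePixelShader(),
// but no geometry shader methods at all -- the stage doesn't exist in the
// API, or in the XP driver model underneath it.

Bolting that stage onto DX9 would mean new runtime interfaces and new driver entry points on an OS whose driver model was never designed for them, which is exactly the "major change, probably involving a service pack" territory I'm talking about.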

I don't have to pay anything extra to MS every 3 years to get my new hardware and software to work.  My current hardware works just fine in XP right now.  It'll work fine in XP 3 years from now.  My old video card ran in XP for 2 years and still works to this day.  Any new video card that supports DX10 will also support DX9, so it'll work just fine.  I didn't pay MS any money to make any of it work.  In fact, I haven't paid any money to MS since I bought XP back in 2001.

You expect more support?  Hey genius, try going to Apple and getting support on OS 9.  I bet they don't do it.  Getting support from MS for XP is the same thing.  It's an outdated OS that has run its course.  Just because you still think it's worth supporting doesn't mean it is.  I know people that are still running Windows 98.  I don't expect MS to support them anymore either.  Windows 98 is an insecure piece of junk.

You're all basically arguing against a company improving their product, which is totally funny.  A few years ago, you were probably bitching that Vista still wasn't out and that XP was getting old.  Now I hear the same complaints about Vista that I heard about XP when it was released.  I'm sure when Windows 7 comes out, we'll hear about how great Vista is and how much Windows 7 sucks in comparison.  It's pretty much a never-ending cycle.

I'm not trying to convince any of you to upgrade.  That's your choice if you want to or not.  But don't try to say that MS is forcing you to upgrade.  You don't have to do it.  Go use Linux or some other alternative.  No one's making you upgrade anything.

Post
#336919
Topic
Blu-ray prices not coming down
Time
Moth3r said:

Talking of the Chinese; I read today that the pirates over there are now selling "fake" blu-ray discs - these are actually just 720p AVCHD discs on DVD9, in blu-ray style cases.

http://arstechnica.com/news.ars/post/20081117-fake-blu-ray-discs-hatched-in-china-industry-is-concerned.html

That article isn't entirely accurate.  AVCHD is just a disc format built around AVC encoding: http://en.wikipedia.org/wiki/AVCHD.  A Blu-ray disc can have MPEG-2, VC-1, or AVC video.  The "problem" with these releases is that they're ripping the movie from the disc and then down-resing it to 720p, but that's not related to AVCHD at all (they could just as easily be doing it with VC-1 as far as I know).

They're obviously fake "Blu-ray" discs, but that's all the article got right.

Jay said:

Didn't you just kind of argue my point? If the "average movie watcher" doesn't perceive that much of a difference, why so much bitching about high prices? Just watch DVDs and be happy. Enough complaining about $300 Blu-ray players like they're some kind of outrageously priced luxury item.

American consumers have such an unbelievable sense of entitlement.

Because it's the latest and greatest thing and they need to "keep up with the Joneses".  They may not perceive a difference, but if everyone's telling them how great it is, then they want to get in on the action.  Unfortunately, they don't want to spend a lot of money to get in on it.  So right now, they're waiting for player prices to fall.

skyjedi2005 said:

Well, the way I see it, people hate Blu-ray because they got burned supporting HD DVD, replaced all their VHS movies with DVD, and don't want to buy all those movies again.

There is a certain mentality that once you purchase a movie you should own it for life and get free upgrades.

Uh, the "average moviegoer" didn't support either format.  The average viewer was waiting for one format to win "the war" so they could buy into that format.  Now that there is a single format, the "average" viewer is waiting for the price to drop.  They were told that prices were high due to the war and that once there's a single format, everyone will adopt it and the price will fall.  That didn't happen.  It's only just now starting to happen.

I don't know who you talk to that has this mentality of "buy it once and get free 'upgrades' for life".  I don't know anyone that thinks that at all.  Everyone I know is ready and willing to buy the better format; we're all just waiting for the player prices to come down a little.  The Digital Bits has some articles about a Magnavox on sale at Walmart for under $200.  It actually has some good reviews too.  Being this close to Black Friday, I'll probably wait just a little bit longer and get a killer deal after next week.

Post
#336785
Topic
Blu-ray prices not coming down
Time
Jay said:

I feel like some people want something for nothing and won't be happy until Blu-ray players become $99 commoditized Chinese junk and bargain bins are littered with $5 titles.

DVD is still alive and well, so why complain about Blu-ray costing more when it offers more? Isn't that kind of the point?

First, you're probably right about the Chinese players, but so what?  My last DVD player was a cheap $40 CyberHome, but I didn't need anything more.  That didn't mean there weren't other, better players on the market that cost more.  So what if someone wants a cheap Chinese player?

Second, the only additional thing Blu-ray gives the average movie watcher is higher resolution, which they apparently aren't noticing anyway.  So the point, from their perspective, is why should I pay more for it when my DVDs look great on this upscaling DVD player?  Get Blu-ray players down to the price of upscaling DVD players (about $100 I think) and many more people will pick them up since 1) they'll look better than their upscaling players and 2) they'll cost the same amount.

Personally, I would never make such a comparison.  I wouldn't even mind spending $150-$200 on a Blu-ray player.  Unfortunately, I haven't found a decent one yet at that price point.  All the ones I've seen reviews for say they either have slow startup times or they just plain suck.  It's the more expensive ones that don't have a problem starting up in less than 30 seconds (anything that starts up slower than that is not acceptable).

I'd love to adopt Blu-ray, but for now, I've decided to stick with DVD.  If player prices come falling down in time for the holiday season, maybe I'll pick one up.