
Robert A. Harris on Film Grain and Blu-Ray

Author
Time

This is a great editorial on something that has become a HUGE problem in the home theater industry: digital noise reduction. Every time I hear that phrase I shudder to think of the US Blu-Ray release of PAN'S LABYRINTH (which I sold to buy the European Blu-Ray at higher cost, but with much greater satisfaction).

Here's the full article.  Text reproduced below:

-----------------

DNR... and Other Things That Go Bump in the Night - Robert A. Harris

Blu-ray has won the high definition sweepstakes, with marketing that trumpeted the reproduction of the look of cinema. To that end, it has a huge amount of disc space to back up its abilities, which are many.

But within months of its taking over the high definition market, problems have set in, caused by and affecting many people both inside and outside the industry.

The problems?

On one side, bloggers, critics and "reviewers," some with an affinity toward video games, who want "eye candy."

And on the other, studio executives who know about film, but who may be wincing in fear of each and every comment on line that accuses their releases of being "grainy," lest the precious discs not sell in high numbers.

Quality and Quality Control have been a problem since the laserdisc days.

Generally on VHS you couldn't see problems if they existed.

DVD changed all that.

Early on it was recognized that masters prepared for laserdiscs weren't going to cut it, and the studios did the correct thing by going back and preparing new masters for the less tolerant DVD playback system.

But even new masters didn't solve all of the problems. One major concern was film grain, and the havoc it played upon compression.
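To see why grain plays havoc with compression, here's a crude sketch (zlib standing in for a real video encoder; the image size and noise level are assumed figures). Grain is high-frequency, frame-to-frame random data, which is the worst case for any compressor:

```python
# Crude demonstration: grain-like noise makes image data far less
# compressible. zlib stands in for a real video codec; all parameters
# here are illustrative assumptions.
import zlib

import numpy as np

rng = np.random.default_rng(0)
h = w = 256
smooth = np.tile(np.linspace(0, 255, w, dtype=np.uint8), (h, 1))
grainy = np.clip(smooth + rng.normal(0, 8, (h, w)), 0, 255).astype(np.uint8)

for name, img in [("smooth gradient", smooth), ("with grain", grainy)]:
    size = len(zlib.compress(img.tobytes(), 9))
    print(f"{name:16s} {size:7d} bytes compressed")
# The grainy version compresses far worse. At a fixed disc budget, the
# encoder must either starve the detail or smear the grain.
```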

Vendors began to hang out signs for mastering, compression and other skills.

Some with all of the requisite bells, whistles and tech speak, but not much more experience than one might find at Ray & Irwin's Garage.

Studio asset protection staff would create a quality master, occasionally from newly minted film elements, only to have work destroyed in the end by vendors contracted to do a job without the requisite skills.

Two of the more notorious that come to mind are the standard definition DVD releases of Cold Mountain and Gangs of New York. Both digital abominations from quality masters.

Which brings us back to The Great Grain Debate.

Film Grain has been with us now for well over a century and a half, and was never looked upon as "the enemy" until a few "reviewers" and bloggers got it into their heads that it was somehow in the way -- keeping them from seeing the true image.

Grain is an inherent part of the film image.

Remove it at your peril, as layers of real problems may then arise.

Do the DVD buying public and studio executives truly believe that Chaplin, Keaton, Ford, Wellman, Welles, Hitchcock, Lean, Mamoulian and others were a bunch of hacks?

Is it possible that Bitzer, Burks, Young and Toland had not a clue, and need some lowly digital tech to clean up their errors?

Have generations of scientists in Rochester had no clue about the horror that they were handing down to us?

I don't think so.

Film Grain has been a known entity and a part of the design of film from the beginning of photography. It has merely gotten smaller and less obvious over the decades.

Everyone involved, from the scientists who created the emulsions, to the production designers who created the sets, the make-up artists who gave the actors their "look," the costume designers who dressed them, the cinematographers and camera operators who exposed the raw stock, the processing labs, effects experts, optical camera workers and finally post-production executives who put it all together...

really did know then, and still know today, what they were and are doing.

We have a film history that goes back 104 years.

Why, all of a sudden, is it left to people who wouldn't know which side of a camera to point toward an actor to totally re-write the history and look of our cinema?

And this is precisely what is at stake.

Don't get me wrong.

I'm more than aware that a certain amount of grain reduction is occasionally needed for acceptable compression in standard definition DVDs. Grain can be digitally manipulated to form a more cohesive image, especially when dealing with a myriad of elements sometimes needed to create a quality master from old or damaged film elements.

And then came Blu-ray, and along with it, a great piece of hardware -- the popular price leader -- PS3.

Early on, one of the major selling points for the Blu-ray system was a higher capacity disc as compared to HD DVD. This meant that not only could more data be encoded to a disc, but that compression rates could be lowered, and by that I mean more bits per second making their way through the pipeline.
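To put rough numbers on that (a back-of-the-envelope sketch; the capacities and runtime are illustrative assumptions, and real discs reserve space for audio and extras, so actual video bitrates sit below these ceilings):

```python
# Rough comparison of average bitrate ceilings, assuming the whole disc
# held a single stream. Capacities and runtime are illustrative figures.

def avg_bitrate_mbps(capacity_gb: float, runtime_minutes: float) -> float:
    """Average bitrate in Mbit/s for one stream filling the disc."""
    bits = capacity_gb * 8e9        # decimal gigabytes -> bits
    seconds = runtime_minutes * 60
    return bits / seconds / 1e6

runtime = 170  # minutes, roughly the length of a long catalog feature

for name, gb in [("DVD (dual layer)", 8.5),
                 ("HD DVD (dual layer)", 30.0),
                 ("Blu-ray (dual layer)", 50.0)]:
    print(f"{name:20s} ~{avg_bitrate_mbps(gb, runtime):5.1f} Mbit/s")
# DVD ~6.7, HD DVD ~23.5, Blu-ray ~39.2 -- far more bits per second
# available before compression has to fight the grain.
```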

There have been some glorious Blu-ray discs produced of catalog titles. Discs that allowed the full cinematic beauty and grain to survive the transition.

Think Bullitt from WB, The Sting from Universal, and Reds from Paramount, and more recently The Sand Pebbles from Fox and The Professionals from Columbia (Sony). All of these films were shot beautifully, with fully exposed negatives. Add to these more problematic productions like Columbia's Baron Munchausen and Dracula, films with a coarser grain structure and a slightly softer feel to their imagery.

Some of these beautifully replicated films were hit by the backlash of cinematically less sophisticated innocents.

Feelings are made known, and the message goes through the studio food chain like wildfire.

Which is the long way 'round in finally bringing us to one of the newest and most discussed Blu-ray releases of 2008.

And it is being discussed for all the wrong reasons.

A film, photographed on Eastman Color Negative 5254 on 65mm 5 perf, through some of the finest glass imaginable by a great cinematographer.

The Best Picture of 1970.

Franklin J. Schaffner's Patton.

Allow me to state up front that this isn't really about Patton, and it isn't about Fox. Nor is it about any of the vendors who worked toward bringing Patton to Blu-ray. It is about film in general, and how our heritage may be viewed and survive in the future.

Patton is merely the current poster child for how it should not be handled.

This is a difficult piece to write, as there must be a balance between passion and compassion.

Passion toward the concern that our film heritage is in jeopardy, and compassion toward those individuals involved on both sides, in a virtual tug of war.

The studio people making decisions are intelligent. They want to create top quality software, service the needs of the public, and have that software jump off store shelves.

No one wants to create a problematic disc.

The folks on line and in print are just as concerned about what they see and what they may perceive to be a problem. And they too are an intelligent bunch.

Fact: The apparent grain structure of Patton is approximately 40% of that of a normal 35mm 4 perf production. It should not need to be touched in preparation of porting it over to Blu-ray.

Fact: We now have extremely high quality means of extracting an image from a piece of 65mm element, be it scanning in 8k or capturing the image directly to HD.
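That 40% figure in the first fact falls out of simple geometry. Grain crystals are roughly the same physical size on both stocks, but a 65mm 5-perf frame needs far less enlargement than a 35mm 4-perf frame to fill the same screen. A minimal sketch, using approximate frame widths (the exact aperture dimensions are assumptions):

```python
# Apparent grain scales with the magnification needed to fill the screen,
# i.e. inversely with the width of the camera negative. Dimensions are
# approximate (assumed aperture figures).

W_35MM_4PERF = 21.95  # mm, approx. usable width of a 35mm 4-perf frame
W_65MM_5PERF = 52.50  # mm, approx. usable width of a 65mm 5-perf frame

ratio = W_35MM_4PERF / W_65MM_5PERF
print(f"65mm apparent grain ~= {ratio:.0%} of 35mm")  # ~42%, close to 40%
```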

And this all should have come together to yield what many hoped would be one of the finest Blu-ray releases of 2008.

Some believe that it is.

Anyone recall my mentioning that a major selling point of Blu-ray was its ability to reproduce the look of cinema in the home theater?

Patton didn't.

Blu-ray, as a system, failed the acid test.

But why did it fail?

Was it the fault of Blu-ray?

Partially.

There seems to be no protective mechanism in place to weed out questionable work, which at this early stage of life will make Blu-ray look like gaming software, rather than software designed to reproduce motion pictures.

Think an entity like THX or TAP.

As an organization, Blu-ray has failed because it allows software that does not deliver its promise of quality to hit the marketplace. Such a mechanism should have been in place since Day One.

Is this the fault of the "grain reduction" vendor?

Not really. Someone hired them to do a specific job and approved their work.

Is it the fault of the home video division?

Partially.

But this is where it gets difficult. Let me re-state a fact.

No one sets out to create a problematic Blu-ray release.

Everyone is trying their utmost to create a superior product.

However, the wrong people may be adding their thoughts to what film should look like. This becomes a problem only if they have no idea what it should look like.

As cinematographer Gordon Willis said: "The film has already been made."

"The job is to reproduce it."

And that should be simple, except that there may be too many involved in the creation of the final result.

Some executives are fearful that the disc they release might not be a hit, or might get poor notices from bloggers. Because of this, some have begun to listen to people who have no idea what they are preaching.

This isn't about what anyone likes or doesn't like.

It is about the intentions of the filmmakers.

So what's the upshot?

Someone at the studio makes the decision that "our films can't have grain."

A well-meant decision…

but wrong.

What is grain anyway? What are they thinking?

Grain, they feel, must be something that is obviously not a part of the exposed image; that it somehow makes its way onto the film as it is either duplicated or ages.

But, if the grain grows as the film emulsion ages, then it should (must) be removed so that the crystalline clarity of the original can shine through on guess what…

Blu-ray.

And hence, the perceived grain "problem."

Can you scrape it off?

Tried that.

Doesn't work. Somehow it has attached itself to the image. Think Facehugger.

Can you wash it off?

Nope.

What if we print it to a dupe from the other side of the film?

Better.

But then things don't look quite as they used to.

What's the answer then?

Digital!

The concept makes the rounds with executives making possibly correct decisions, but based upon a flawed set of facts.

I'm not suggesting these aren't good executives, just bad facts.

There are digital facilities willing to remove the unwelcome grain.

These facilities are all over the world.

Some are extremely capable, others less so, and some -- not at all.

And this is where perfectly reliable and sometimes capable facilities do the bidding of concerned and wary studio executives, and...

They remove the grain.

Let's get down to the basics.

Is it easy to remove grain?

Yes.

At its most basic, even a child can do it.

For those who have ever projected film or slides, the answer is simple.

You get rid of grain by throwing the image out of focus.

Not blatantly out of focus, but marginally… ever so slightly.

Then you add a bit of digital sharpening, a touch of gamma, and a bit of basil.
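For the curious, that recipe (minus the basil) is trivial to reproduce. A toy sketch using Pillow on a hypothetical frame grab; the radii, amounts, and gamma are arbitrary assumptions, and the point is that this is exactly the processing a careful transfer should avoid:

```python
# The "defocus, then fake it back" recipe in miniature, using Pillow.
# All parameter values are arbitrary; frame.png is a hypothetical input.
from PIL import Image, ImageFilter

frame = Image.open("frame.png")

# 1. Throw the image ever so slightly out of focus: the grain dissolves,
#    and the fine detail quietly dissolves with it.
soft = frame.filter(ImageFilter.GaussianBlur(radius=1.2))

# 2. A bit of digital sharpening to fake the edges back.
crisped = soft.filter(ImageFilter.UnsharpMask(radius=2, percent=120))

# 3. A touch of gamma to liven up the now-waxy image.
gamma = 0.95
result = crisped.point(lambda v: round(255 * (v / 255) ** gamma))

result.save("frame_degrained.png")  # softer, shinier, missing information
```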

The final product?

Grain reduced… or gone.

Occasionally pretty, and if no one compares it to the original, quite acceptable.

Although every digital facility promises grain removal, and some offer a quality product in incremental stages, I've personally seen the work of only one facility that, to my eye, has the capability to remove or reduce grain without affecting resolution -- and by that I mean without removing a large chunk of high frequency information along with the offending grain.

Which brings us back to Patton.

An element is scanned and captured digitally, and an edict goes out from someone at the studio to "remove the grain."

The vendor goes about the work of removal, and at some point someone needs to inspect and approve the work, but here comes the next problem.

Professional monitors are notoriously small -- generally 40" or smaller.

I viewed Patton on a 30" Sony HD XBR CRT, and the image looked glorious. The information was so compacted, it was difficult to tell that anything was missing.

Only later, when I viewed it on a larger screen, did it become apparent that all was not well.

Faces were waxy, background detail was gone, and clothing, walls, and dirt on Jeeps were all missing high frequency information; the image appeared dead, much like a video game.

This is where it gets more complex.

Because this isn't really just about grain or the removal of grain.

One can reduce grain and have a perfectly acceptable Blu-ray --

just with less grain.

The problem is grain reduction gone wrong.

The removal of all high frequency information, and the destruction of the image.

This is the problem with Patton.

Had grain been reduced, rather than removed, and had it been done properly, without the loss of detail, all would have been well.

But someone approving the work would have had to recognize the problem.

Patton has arrived on Blu-ray.

And reviews, across the board, have been generally excellent.

Great for the studio.

Less so for the film.

Why?

Because some people are "reviewing" discs on systems inappropriate to the task, and without the necessary background or reference to even have a proper opinion.

While it's perfectly fine for someone to say to a friend "Gee, that looks pretty," it can become problematic for that individual to broadcast their thoughts…

"GEE THAT LOOKS PRETTY!" around the world…

and onto the computer screens of concerned executives who, quite rightly, want to keep their jobs.

The people who made these films cannot be happy.

At worst it makes it appear that cinematographers, camera operators, and focus pullers are incapable of doing their jobs.

Where do we go from here?

A few suggestions:

The folks behind Blu-ray need to take a position.

Is their system to be used as promised, to give the home theater enthusiast the cinema experience?

Or will our film heritage henceforth look like video games?

Studio executives need to be educated about grain -- about whatever it is that makes up an image and how it gets to Blu-ray -- or sit back and allow someone else to deal with the technical end of things.

I can tell you as an absolute that every studio has someone in place who can do this.

Those who call themselves reviewers, whether they be bloggers or work for high-end magazines and newspapers, also need to be educated as to what film and video can and should look like.

Let's be quite honest about this.

Ray and Irwin's Garage & Blu-ray Disc Review is probably a combination that won't work in the real world, especially when tied in to the Internet.

Can this be fixed?

Easily.

But only if people are willing to listen, to turn to their own technical people and understand that the thousands of people responsible for the films in their libraries really did know what they were doing.

Can the studio fix Patton?

Since this is now old news, I would bet that they're already working on it.

They have great people in place, and the ability to correct, perfect, recall and replace.

No harm done.

But the point is to not do it again. That is the concern. As I said, this isn't about Patton, and this is not about Fox.

This is about harmful and improper grain and high frequency removal that can have a horrific effect on catalog titles from every studio and copyright holder across the board.

That is the concern.

Not Patton.

As an aside, I'm pleased to find in recent days, as discs get to consumers, that I'm not a lone voice in the quest for film looking like film. More and more people who "get it" are adding their thoughts to web forums and blogs, including some on-line reviewers.

I can hear it now. So he likes the bloggers that agree with him.

Of course. Because I'm right.

A recently posted comment from "Xylon" on the AVS Forum states a clear position:

"The Blu-ray release of Patton may give us a glimpse of what could/has happen[ed] when studios cater to the masses. A revisionist piece of cow dung (!) that only they could like. This is not cinema. This transfer is not Patton. This is not the same movie I watched.

The Blu-ray format with all of its 50GB disc space and bandwidth is useless if the movie put on it is not representative of what was shown at the theaters. If you really have to use DNR and EE to cater to the lowest common denominators… put them on the players. Let them switch it on. As for film lovers, that means us, you know the early adopters?, the ones that spend thousand of dollars on your hardware and software. Take care of us. Restore the movie according to the filmmaker's intent.

To those people who have been asking me in recent days if it's worth the purchase, I will say no. Don't reward the studios with this release. Renting it is the best I can recommend. -- Xylon"


Can we just do nothing and allow a negative trend to continue?

I would hope not.

The point that the folks who hate grain need to understand is that while it can be reduced if absolutely necessary to create a quality piece of software, it need not be eliminated completely, and most important, high frequency information must remain inviolable.

More people are likely to see our cinema heritage on Blu-ray than in theaters in the future. It is precisely that heritage that is now in jeopardy.

Once again, cinematographer Gordon Willis said it best:

"When people see things they don't understand, they become frightened, and the concept of what [a film] is -- or was -- still eludes some people. We all tend to reduce or expand things to a level we understand, and it can be fatal to a film… if what someone understands is Petticoat Junction.

The film has already been made."


To those who might wonder why I choose to make Patton the poster-child for poorly used DNR on a Blu-ray disc, the answer is simple.

With all of the classic or catalog titles thus far released via the Blu-ray format, Patton, because of its great quality as a film and huge technical benefits, had the most to gain.

It could easily have been a superb example for showing off the Blu-ray system.

In place of what might have been, we have a disc that, in terms of its raw potential, could have been extraordinary, but is far from it.

And this is very, very sad for any number of reasons, inclusive of the ultimate marketing damage to Blu-ray as a system for the dissemination of quality cinema in the home.

What we need is a level of quality control within the Blu-ray organization that achieves a standard that permits consumers to purchase a Blu disc knowing that the quality will be of the highest order.

Ultimately, and most easily, the grain situation might come down to reading the directions. Want less obvious grain?

Turn down the sharpness on the Blu-ray player or monitor. The controls are there, waiting to be used.

RAH

Author
Time

 

I couldn't agree with him more. I've actually been thinking about this issue a lot since the last "film grain" thread.

Film grain is a product of actual, real light hitting actual, real molecules. It's a record of reality that is itself a reality: it contains real information, and any change to that information must result in a loss of it.

I don't care how good a studio supposedly is (Lowry or whoever Zombie mentioned in the other thread), removing film grain will always destroy information regarding the random movement of light and the random behavior of the film as a physical medium (chemically speaking). Supposedly preserving that kind of immense information in a way where we put stuff "back where it belongs" is impossible.

At best we can only simulate what "should" be where and that just seems wrong in principle to me. Light is already where it's supposed to go and chemicals already move as they're supposed to move. I don’t care if we use a chemical or more electronic method to capture light-based visuals, there is no way to escape physical reality to the point where we can make a completely perfect image. Even if our goal is to merely simulate what an existing image would have looked like using a more accurate process, how do we define accuracy and perfection apart from another reality that is physically connected to it? A computer algorithm isn’t physically connected to the world which produced that image and it must destroy the information contained within grain and replace it with something else. That doesn’t seem right to me.
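A crude way to see the irreversibility point (a toy numpy sketch; the sizes and noise levels are assumptions): two different grain patterns collapse toward the same smoothed frame, so no algorithm can tell from the result which pattern was really there.

```python
# Two distinct noisy frames collapse toward the same smoothed frame,
# so the smoothing loses the information that distinguished them.
# Synthetic data; all parameters are assumed.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
base = rng.random((64, 64))                 # shared underlying image
a = base + rng.normal(0, 0.05, base.shape)  # one grain pattern
b = base + rng.normal(0, 0.05, base.shape)  # a different grain pattern

dist_before = np.abs(a - b).mean()
dist_after = np.abs(gaussian_filter(a, 2) - gaussian_filter(b, 2)).mean()
print(f"before smoothing: {dist_before:.4f}")  # clearly distinct
print(f"after smoothing:  {dist_after:.4f}")   # far closer -- the
                                               # distinguishing detail is gone
```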

Even if I were okay with simulated information, I still can’t believe we have a sufficiently competent method to simulate the immense depth of information which can be recorded by film from frame to frame. Certain simulation techniques may look great, but I can’t believe the same level of complexity is preserved.

Lastly, film, like any visual medium, has its own unique beauty in the way it physically behaves. That beauty may get in the way of portraying certain visuals in the way a certain artist or viewer would like, but once it's chosen as the final medium, it shouldn't be supposedly improved later on without making it clear that we're making a new work of art (as opposed to "improving" an old one).

"Now all Lucas has to do is make a cgi version of himself.  It will be better than the original and fit his original vision." - skyjedi2005

Author
Time

DNR is different from what Lowry does, keep in mind. He's not railing against Lowry, but DNR. Digital Noise Reduction just smears the image to create the illusion of smoothness. Lowry actually doesn't destroy information per se--though Lowry should never be used to destroy grain, unless it is dupe grain. Lowry is a dirt removal algorithm, and dirt, generally speaking, is not part of the photography but a foreign substance physically attached to the film through age/use, and it deserves to be removed--it's basically a gentler way of running film through a cleaning bath. But it can--and has--been abused to remove grain that is part of the photography. Generally, though, Lowry and other clean-up software of its type are our friends--they should not be confused with DNR, which doesn't solve anything.
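To make that distinction concrete, here's a minimal numpy sketch (synthetic data, assumed noise levels -- not any vendor's actual pipeline). Grain is random from frame to frame, so averaging across frames cancels it while static detail survives; a spatial blur, which is roughly what crude DNR amounts to, smears the detail along with the grain:

```python
# Toy contrast between temporal averaging and spatial smearing.
# Synthetic data; the noise level and blur sigma are assumed parameters.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
detail = rng.random((64, 64))          # stand-in for static fine detail
frames = [detail + rng.normal(0.0, 0.1, detail.shape) for _ in range(5)]

temporal = np.mean(frames, axis=0)               # grain cancels, detail stays
spatial = gaussian_filter(frames[2], sigma=1.5)  # detail smears with grain

rms = lambda img: np.sqrt(np.mean((img - detail) ** 2))
print(f"temporal average error: {rms(temporal):.3f}")  # ~0.045
print(f"spatial blur error:     {rms(spatial):.3f}")   # several times worse
```

The caveat is that plain averaging only works where nothing moves; real restoration tools have to motion-compensate first, which is where the hard work (and the abuse potential) lives.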

Not to take away from the general sentiment--this anti-grain thing is perplexing and frustrating. Film should not look like a digital rendering because it's not a digital rendering; it's film. Grain looks beautiful IMO, just like an oil painting looks more aesthetically pleasing than a crystal-clear digital photograph of the same scene--disagree if you like, but it's an aesthetic element of motion pictures and it should be respected.

Author
Time

This is going to be an issue as big as anamorphic widescreen; every format has a hurdle to cross, and I think film grain will be the major issue this time.  Unfortunately, it will be harder to get the public on board with.  As Harris says, the public wants a squeaky clean image, something that looks like it was made today and shot digitally.  This is just not the case.  I also think people get "grain" confused with "dirt."  Dirt should be removed.  Grain should not. 

Author
Time
zombie84 said:

Lowry is a dirt removal algorithm, and dirt, generally speaking, is not part of the photography but a foreign substance physically attached to the film through age/use, and it deserves to be removed--it's basically a gentler way of running film through a cleaning bath. But it can--and has--been abused to remove grain that is part of the photography.


    Ahh, I misunderstood. All I remembered was a comment about using Lowry's technique to supposedly make Citizen Kane crystal clear. In that case, it wouldn't be the same film anymore, but a simulation of what a clearer version would look like instead.

    Dirt removal is good. But is Lowry's process an actual cleaning/scan that directly interacts with the dirt? Or is it merely a visual algorithm that compensates for the dirt? In the case of the latter, you'd still have to simulate what the image looked like before that dirt accumulated, and it wouldn't be a precise view of what's underneath. (Though, if that's the only option to see the film in a way that's closer to its original, clean state, then I'm all for it despite the tiny amounts of information loss.)

    I guess my biggest issue is that simply because a certain amount of information (in an image) isn't precisely recognizable as a sharp object, that doesn't mean it is bad or useless information. First, it can be aesthetically pleasing and, second, it is coherent in the sense that it came from real objects, and both of those qualities are worth preserving. Using this technology to try and restore what the films looked like originally doesn't bother me as much, but a technique that does this cleaning with the least amount of information loss would be preferable.

"Now all Lucas has to do is make a cgi version of himself.  It will be better than the original and fit his original vision." - skyjedi2005

Author
Time
Tiptup said:


    Ahh, I misunderstood. All I remembered was a comment about using Lowry's technique to supposedly make Citizen Kane crystal clear. In that case, it wouldn't be the same film anymore, but a simulation of what a clearer version would look like instead.

 

In the case of Citizen Kane they did screw it up; it was their very first project, and I guess they hadn't worked out the software completely, or even what their approach would be. In any case, their Citizen Kane looks very good, but there's almost no grain; it's practically video-like. But in the case of movies like the James Bond films or Raiders of the Lost Ark, made much later, they did it as they should. There's the natural film grain, but the dirt is gone. In the case of Star Wars there were some problems, but there was also, I am guessing, an aesthetic choice from Lucas to make the film look as clean and sharp as possible, so it doesn't look exactly as it normally should in terms of the film quality itself. But it's not that far off; it's still pretty close to the quality of the original negative. Star Wars should not look like it does in the GOUT; these are the original negatives here, and generally there should not be very coarse grain. If you look at the SE of ANH, you can see the grain of the emulsion, so nothing's been erased, or at most very little has been minimized.

    Dirt removal is good. But is Lowry's process an actual cleaning/scan that directly interacts with the dirt? Or is it merely a visual algorithm that compensates for the dirt? In the case of the latter, you'd still have to simulate what the image looked like before that dirt accumulated, and it wouldn't be a precise view of what's underneath. (Though, if that's the only option to see the film in a way that's closer to its original, clean state, then I'm all for it despite the tiny amounts of information loss.)

 

Lowry's process is not necessarily for dirt per se, but that's what it's intended for and what it's used for. Basically, dirt looks different than emulsion grain, so the computer can differentiate between what's in the emulsion and what's a foreign body on the negative. What it does is look at the frame before and the frame after to reconstruct what the frame between them actually looks like without the dirt; then it erases the speck of dirt and paints in the pixels of what should be there based on the preceding and following frames. This only works for small pieces of dirt, the sort of speckled dirt that just comes from wear and tear and age; if you have big globs of it, or big tears and damage in the film, I'm not sure how useful the program would be, so Lowry also has a team of rotoscope artists who hand-paint out the larger dirt/tears/damage, etc.
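A minimal numpy sketch of that look-at-the-neighbouring-frames idea (a toy illustration of the general technique, not Lowry's actual algorithm; the threshold and data are assumptions):

```python
# Toy neighbour-frame dirt repair: dirt is transient, so a pixel that
# disagrees sharply with BOTH the previous and the next frame is flagged
# and painted in from its temporal neighbours. Threshold is an assumption.
import numpy as np

def repair_dirt(prev, curr, nxt, threshold=0.25):
    """Replace transient specks in curr using the adjacent frames."""
    estimate = (prev + nxt) / 2.0
    # Genuine image content usually agrees with at least one neighbour;
    # a speck of dirt agrees with neither.
    is_speck = (np.abs(curr - prev) > threshold) & \
               (np.abs(curr - nxt) > threshold)
    return np.where(is_speck, estimate, curr), is_speck

# Three identical frames, except a white speck on the middle one.
prev = np.full((4, 4), 0.5)
nxt = np.full((4, 4), 0.5)
curr = prev.copy()
curr[1, 2] = 1.0                           # the "dirt"

fixed, mask = repair_dirt(prev, curr, nxt)
print(int(mask.sum()), "speck repaired")   # -> 1
print(fixed[1, 2])                         # -> 0.5, painted back in
```

On real footage this simple version would mistake fast motion for dirt, which is presumably why, as described above, the serious tools pair the automated pass with rotoscope artists for anything large.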

This process isn't unique to Lowry, I should point out. Their computer program is particularly effective, but there are a lot of similar programs, and these techniques are pretty much standard now for serious restoration efforts. Criterion, for example, puts their films through a similar but less aggressive process.

Author
Time
 (Edited)

To me there is nothing so stupid as digital film grain: artificial grain added after the cleaning process of an old movie, or just for looks.

The Lowry Bond and Indiana Jones discs removed all the fine grain as well as the dirt, so they put digital film grain over the entire image. It makes me want to throw the DVDs in the trash.

 

As for Star Wars, there is a reason Lucas wanted to remove all the grain: so that the films would better match the new CGI and the prequels.

A lot of the removal of grain from films is because people want a pristine, clean, 21st-century-looking image that looks like it would have been shot on HD video.

“Always loved Vader’s wordless self sacrifice. Another shitty, clueless, revision like Greedo and young Anakin’s ghost. What a fucking shame.” -Simon Pegg.

Author
Time

"The lowry Bond and Indiana Jones removed all the fine grain as well as the dirt so they put digital film grain over the entire image, makes me want to throw the dvd's in the trash"

 

Where did you get this info from? I find it a little hard to believe.

Author
Time

"A lot of the removing of grain from films is because people want a pristine clean 21st century looking image that looks like it would have been shot on HD Video."

You say that like it's a good thing.

Author
Time

I'm an advocate of film grain, and the best HD stuff I've seen is where they even carried over the film grain. I loathe DVNR'ing and EE'ing too. If I ever entered the film business, given the choice, I'd always shoot on celluloid rather than digital.

Author
Time

You're in agreement with every cinematographer in existence.

Now if only the young directors (and certain older ones) would think that way.

Author
Time
ChainsawAsh said:

You say that like it's a good thing.

 

I think it is. Obviously, a lot of the posters here may disagree, but I'd prefer a mere simulation of what it would look like without grain to a picture with a bunch of artifacts that are nothing more than an unavoidable side effect of film-making. That said, I want the process of removing such artifacts to extend beyond merely blurring the image. It only looks good to me when it's been done frame by frame, practically by hand, like the OT DVDs. Unfortunately, that's incredibly expensive and time-consuming, so we often end up with what Harris describes. Sucks, but that's life for you.

Author
Time

Wow.  Johnboy and DarkGryphon actually disagree on something.

“Always loved Vader’s wordless self sacrifice. Another shitty, clueless, revision like Greedo and young Anakin’s ghost. What a fucking shame.” -Simon Pegg.

Author
Time
skyjedi2005 said:

Wow. Johnboy and DarkGryphon actually disagree on something.

 

Err... do we usually agree? I haven't been keeping score.

Author
Time
 (Edited)
Johnboy3434 said:

ChainsawAsh said:

You say that like it's a good thing.

 

I think it is. Obviously, a lot of the posters here may disagree, but I'd prefer a mere simulation of what it would look like without grain to a picture with a bunch of artifacts that are nothing more than an unavoidable side effect of film-making. That said, I want the process of removing such artifacts to extend beyond merely blurring the image. It only looks good to me when it's been done frame by frame, practically by hand, like the OT DVDs. Unfortunately, that's incredibly expensive and time-consuming, so we often end up with what Harris describes. Sucks, but that's life for you.

 

The flaw is thinking of them as "artifacts," since they are not. They are the physical makeup of the medium used to capture the image, just like when you look at a painting you can always see that it is dried semi-viscous liquid moved and arranged by the movement of a brush. Would you like all the "artifacts" (i.e. lines of paint, uneven quality to the layers applied, and visible and semi-visible brushstrokes) in every painting ever made to be erased too? This is fundamentally flawed reasoning, and a severe and fundamental misunderstanding of the medium of motion pictures. They are not artifacts. They are part of the medium and the art. They should not be tampered with. This is exactly what Harris is talking about: this flawed reasoning that grain is somehow "artifacts" in the way of seeing the real, clear image. It's scary to realise how many people share this viewpoint. That's what happens when aficionado-based mediums go mainstream; when I see the first "fullscreen" Blu-Ray, I might actually cry.

Author
Time
zombie84 said:

Would you like all the "artifacts" (i.e. lines of paint, uneven quality to the layers applied, and visible and semi-visible brushstrokes) in every painting ever made to be erased too?

 

Sure. Why not? If it makes the objects in the picture more prominent (as opposed to the stuff that's only there because of the way it was made), then I'm all for it.

 

Author
Time

But if it wasn't made that way, it wouldn't be there at all, now would it?

Author
Time
ChainsawAsh said:

But if it wasn't made that way, it wouldn't be there at all, now would it?

 

So a piece of art is worthless if it isn't kept exactly the way it was made to begin with? Rubbish. Everything can be improved. We can only fight about what qualifies as "improvement". I think grain removal is an improvement. You do not. Such is the nature of aesthetics.

Author
Time
Johnboy3434 said:

ChainsawAsh said:

But if it wasn't made that way, it wouldn't be there at all, now would it?

 

So a piece of art is worthless if it isn't kept exactly the way it was made to begin with? Rubbish. Everything can be improved. We can only fight about what qualifies as "improvement". I think grain removal is an improvement. You do not. Such is the nature of aesthetics.

 

But not respect for what the art actually is, nor the way it was made, nor the way the artists, regardless of intention, actually made it. And that's the point. Your opinion on what looks better means fuck all, because that's not what the argument is even about; it's about respecting the historical truth of the way the work was made.

Author
Time
zombie84 said:

But not respect for what the art actually is, nor the way it was made, nor the way the artists, regardless of intention, actually made it. And that's the point. Your opinion on what looks better means fuck all, because that's not what the argument is even about; it's about respecting the historical truth of the way the work was made.

 

It is the Mona Lisa, and after removing brush strokes and the like, it will still be the Mona Lisa. I believe intent is far more important than reality when it comes to art, especially when it's the creator's intent. If you want to respect the historical truth, then keep a written record of what it was like. Or take a fucking picture before the changes are made. Or release a low quality version of the original as a bonus to the revised version.

Author
Time
Johnboy3434 said:

zombie84 said:

But not respect for what the art actually is, nor the way it was made, nor the way the artists, regardless of intention, actually made it. And that's the point. Your opinion on what looks better means fuck all, because that's not what the argument is even about; it's about respecting the historical truth of the way the work was made.

 

It is the Mona Lisa, and after removing brush strokes and the like, it will still be the Mona Lisa. I believe intent is far more important than reality when it comes to art, especially when it's the creator's intent. If you want to respect the historical truth, then keep a written record of what it was like. Or take a fucking picture before the changes are made. Or release a low quality version of the original as a bonus to the revised version.

 

Or DON'T CHANGE IT IN THE FIRST PLACE, SINCE IT NEVER NEEDED TO BE "IMPROVED."

Author
Time
zombie84 said:

Or DON'T CHANGE IT IN THE FIRST PLACE, SINCE IT NEVER NEEDED TO BE "IMPROVED."

 

In your humble opinion, right?

Author
Time
 (Edited)

No, Johnboy--this whole issue is framed in terms of YOUR humble opinion. Maybe grain should be erased and maybe brush strokes should be erased--in your opinion. For you, that is a preferable aesthetic. But why should your preference dictate how everyone sees it, or how the art, irrespective of ANYONE'S preference, should be presented? Instead of appreciating what the art IS, we want it to be the way we want it to be. But why is your way "the best" way, or an "improved" way? You open up a slippery slope that ends in a kamikaze dive-bomb. Maybe to me the best presentation is that everything should be presented in green tint and have Pikachu from Pokemon somewhere in the frame, because he's awesome; "everything can be improved." Maybe every film should be fullscreen, because this is how I prefer it, and maybe all black and white movies should be colorised, because this is how I prefer it, and maybe all silent films should be dubbed into sound, because this is how I prefer it.

And maybe all movies should end happily, because I hate sad endings, and maybe challenging messages should be edited out because I don't like to think too much when I watch movies, and maybe there should be comedy in every film because I find comedy most entertaining of all and films are designed to entertain above all else.

These arguments are illusions, because it doesn't matter what I want or prefer. Films are what they are, and films are made on film: they have grain because they are made from silver halide crystals that react to light, they have flares because they are photographed through tubes of glass, they sometimes have bad acting and bad special effects because of the limitations of the skill of the players involved, and often they are made in ways in which we would prefer they had been made differently. You have to respect what they are, not what YOU wish they would be.

They are objects that were created by people at a certain point in time and stand as artifacts of that effort across time. They are what they are, not what you think they should be. It's this infection of the "me" generation, which thinks everything should cater to it, that is destroying the medium.

Author
Time

I couldn't have said it better myself. Very well-thought-out response, and I agree 100%.

Author
Time
