The way he described it was that he was removing grain that was the result of additional generations. So yes, the grain is reduced/removed, but if he did it right, the result would be the O-negative's level of grain, before each additional generation made it worse.
That’s what he said, but it’s not what he did.
However, as someone who watches quite a lot of projected 35mm, I can tell you the grain you see in a scan, even a top-quality scan with no visible scanner noise, is significantly more than is apparent in projection. So it is quite reasonable to do some grain reduction to match projection.
The O-negative had badly faded by 1994, so it seems that what Lucasfilm is calling the O-negative is actually partially (or mostly) composed of new film printed in the 90s. We know parts of the O-negative were unusable, and other parts were destroyed in the cleaning process. Damaged sections were re-created from the separation masters, interpositives, and internegatives. Any shots containing CGI (including digital recomposites) were rendered in 2K and then printed to film in 1997.
So when they did the 4K scan, most of what they were scanning would logically have to be new film printed in the 90s, correct? Also, is it possible that the 1997 version contains more frames from the actual O-negative than the 4K scan does? It seems that if any O-negative frames had survived with minimal fading in 1994, the fading would have been even worse by the time they started working on the 4K version.
There’s plenty of material, because they kept everything. The best material is the camera negative; after that come the dupe positives, the dupe negatives, the separation masters, and so on.