I think zombie84 nails it: this is a philosophical discussion, and your position can vary depending on how you define your terms and what your basic assumptions are.
I think most of the people restoring classic films from original negatives would argue that "fidelity" means "closest to what was captured on film" rather than "closest to what was seen in theatres", for example. It all depends on what you consider "the film", and what you think of the various processes that happen to film between shooting and projecting.
If you think "fidelity" could include any source higher up the chain than the prints audiences saw in theatres, then it becomes important to sort out what you think of those processes. I think the processes that happen to a film between shooting and projection can be separated into a few categories:
1) Changes that a filmmaker expects to happen and takes into consideration during filming.
2) Changes that a filmmaker expects to happen but does not take into consideration during filming.
3) Changes that a filmmaker does not expect.
What falls into these categories, of course, can vary a bit from film to film, but #1 could include generational loss hiding detail or aspect ratio cropping, #2 could include reel change cue marks and gate weave, and #3 could include tears, scratches, bad duplication, etc.--or, to be more topical, Tantive IV's burn marks.
And there's the rub. Assuming for a moment that these issues can be so neatly segregated, I personally think #2 and #3 are okay to take out by using an earlier-generation source (one higher up the chain), but #1 is not. How can you remove some of these effects without removing the others?
Using the Wizard of Oz example, I think they should have used sources as close as possible to the original negatives--this would ensure a clean, undamaged image with no cue marks (color accuracy would be taken from whatever source best preserved it, not necessarily the earliest generation). However, taking out the wires was IMO a mistake. First, you have to ask yourself whether the filmmakers really expected the wires to be hidden by generational loss--audiences at the time had no expectation of perfect visual effects, and seeing the wires would have been unremarkable to a contemporary theatre-goer. If you land on that side, leave the wires in--no big deal. Or if you think, as I do, that there was an industry-wide trust that generational loss would tend to hide wires, then the right solution would be to soften the entire film until the wires disappeared, simulating the generational loss without the other unwanted effects.
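Just to illustrate what I mean by simulated generational loss (this is only a sketch--the filename and blur strength are made-up placeholders, and a real restoration would tune the softening against surviving prints), something as simple as a Gaussian blur over every frame would do it:

```python
import cv2

# Hypothetical sketch: soften a scanned frame with a Gaussian blur to
# approximate the detail loss of a couple of print generations.
# "oz_frame.png" and sigma=1.5 are placeholder values, not real ones.
frame = cv2.imread("oz_frame.png")

# ksize=(0, 0) tells OpenCV to derive the kernel size from sigma;
# a larger sigma hides finer detail (wires) at the cost of sharpness.
softened = cv2.GaussianBlur(frame, (0, 0), sigmaX=1.5)

cv2.imwrite("oz_frame_softened.png", softened)
```

You'd raise sigma just until the wires vanish at viewing distance and no further--the point is to mimic what the duplication chain did, not to smear the whole image.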
I also think this is ultimately academic, because unlike a lot of the film tinkering that goes on during Blu-ray restorations, this one is largely user-fixable. If you think your 1080p Blu-ray is too sharp, set your player's output to 720p. Still too sharp? Go for 480p. Yes, you still have to take positive steps to achieve the results you want, but compared to getting rid of DNR or revisionist color timing, it's pretty easy. At least my player allows this; maybe it's not common?
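And if your player doesn't offer the setting, the same softening is easy to do as a one-off conversion. Here's a rough sketch of the 1080p-to-720p round trip (filenames are placeholders, and the exact resolutions are just the ones I mentioned above):

```python
import cv2

# Hypothetical sketch: downscale a 1080p frame to 720p and back up,
# throwing away high-frequency detail much like a lower-resolution
# player output would. Filenames are made-up placeholders.
frame = cv2.imread("frame_1080p.png")
h, w = frame.shape[:2]

small = cv2.resize(frame, (1280, 720), interpolation=cv2.INTER_AREA)
soft = cv2.resize(small, (w, h), interpolation=cv2.INTER_LINEAR)

cv2.imwrite("frame_softened.png", soft)
```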