I don’t understand why any of this matters. Either you liked the film or you didn’t; who cares how it got there?
I enjoy learning about the behind-the-scenes aspects of films. I particularly enjoy the drama and politics behind them.
EDIT: For clarity, I don’t mean politics in the Obamacare sense; I mean the studios vs. the filmmakers, how everyone worked together or didn’t, that kind of thing.