Not a surprise; I feel like 90s CGI looked a lot less sore-thumb-y when timed for theatrical projection than it did on any of the official video transfers. Yes, people still complained when CGI looked fake or rubbery (Spider-Man is a good example), but the different timing, higher contrast, and overall "look" of 35mm theatrical printing seemed to hide the fact that the CGI/digital-composite shots were rendered at a lower resolution and dynamic range than the live action (at least before digital intermediate). In a lot of (low-contrast, IP-sourced) video transfers of such movies, I've always felt like the dynamic range takes a hit the moment the image switches from "pure" photochemically timed live action to anything involving CGI or digital compositing: you suddenly get flattened off-white highlights and flattened grayish shadows.
For example, I remember noticing the duller contrast in every CGI shot when I saw the 2013 re-release of Jurassic Park (even projected in 2D), but in the 35mm preservation, the shots with CG look so much less jarring next to the adjacent live-action shots (and mind you, that's a movie whose CGI quality I've always held up, even though until then I'd only seen it on video).
I also remember noticing, as far back as 1999, that the direct-digital DVD transfer of A Bug's Life looked brighter, and its textures more "plastic", than I remembered from the theater. (I remember thinking the characters almost looked like claymation at times - Tuck and Roll's surface textures, combined with their designs, kept making me think of the Chevron cars.) Of course, Pixar's texturing caught up, but I swear these technical imperfections never occurred to me in the theater, even though I quickly picked up on them watching the DVD on a regular CRT only about a year later. On the other hand, the 35mm-sourced trailers in the extras seemed darker and closer to what I remembered from the theater (and again, I was thinking this in 2000, not just in 2015).
Does any of this make any sense to anyone, or am I just nuts?