No, grain is like noise; it's called generational grain. Most of the visible grain you see in films comes from copying. When you watch 35mm, you aren't watching the negative, you're watching copies of copies, and that's where most of the grain comes from. In modern emulsions the grain itself is very fine and normally not very apparent; in fact, modern negatives got so clean that it encouraged a move back towards adding visible grain to make films more obviously film-like, as with Saving Private Ryan. Noise is a by-product of the camera sensor, while grain is a by-product of the silver halide crystals that make up an emulsion. Because each print is a copy of a copy, the grain compounds with every generation, which is why 35mm looks a little grainy while HD is capable of clean, noise-free, mirror-like images (e.g. The Hobbit), since digital copies are lossless. On a scientific level, it simply isn't possible for film to look like that.
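If you want to see the idea in numbers, here's a quick toy simulation (completely made-up noise levels, just to illustrate the point): every analog duplication step picks up its own layer of grain, while a digital copy is a bit-for-bit clone, so only the analog chain gets noisier with each generation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "frame": a flat mid-grey image, values in 0..1.
frame = np.full((480, 640), 0.5)

def analog_copy(img, grain_sigma=0.01):
    """One analog duplication step: the copy picks up its own layer of grain."""
    return np.clip(img + rng.normal(0.0, grain_sigma, img.shape), 0.0, 1.0)

def digital_copy(img):
    """One digital duplication step: a bit-for-bit clone, no new noise."""
    return img.copy()

analog, digital = frame, frame
for generation in range(1, 6):  # e.g. camera neg -> interpositive -> internegative -> print
    analog = analog_copy(analog)
    digital = digital_copy(digital)
    print(f"gen {generation}: analog noise std = {analog.std():.4f}, "
          f"digital noise std = {digital.std():.4f}")
```

The analog column keeps creeping up with every generation; the digital column stays at zero no matter how many times you copy it.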
Most HD-shot films look noise-free because they don't need to boost the signal; if they did, you would see noise. That's what the "gain" switch does on high-end cameras: it artificially amplifies the signal and raises the noise floor (like that staticky sound on receivers and speakers). Film has the equivalent of a gain switch too, push-processing, which amplifies the signal (the silver halide crystals' sensitivity) and in turn boosts the grain. In the early days of HD filming, we would leave the gain switch on just a tiny bit so the image looked more like film, with that subtle "gainy"/grainy layer. People stopped doing that years ago because, with the widespread adoption of HD displays around 2008-2010, audiences accepted the clean HD look and actually stopped wanting HD to artificially mimic the qualities of film.
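Here's a rough sketch of what that gain trick does to the numbers (again a toy model: gain as a straight multiplier applied after a small dose of sensor read noise, with the dB steps and noise figures made up for illustration). The point is that gain amplifies the signal and the noise together, so even a couple of dB adds that subtle "gainy" layer on top of an otherwise clean image.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sensor readout: a clean flat scene plus a little read noise, values in 0..1.
scene = np.full((480, 640), 0.4)
readout = scene + rng.normal(0.0, 0.002, scene.shape)

for gain_db in (0, 3, 6, 12):          # typical camera gain steps
    gain = 10 ** (gain_db / 20)        # dB -> linear gain
    boosted = readout * gain           # gain amplifies signal AND noise together
    print(f"+{gain_db:2d} dB gain: mean level = {boosted.mean():.3f}, "
          f"visible noise std = {boosted.std():.4f}")
```

At 0 dB the noise is barely there; each extra step of gain lifts the noise floor right along with the picture, which is exactly why leaving a touch of gain on used to read as "film-like".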