Prepare for a lot of technobabble here. Note that ASA is the exact same thing as the "speed" of 35mm still film (100-speed vs. 400-speed, etc. - the number is the ASA).
The "resolution" of 35mm film depends on the ASA of the film stock used. A film with a low ASA like 64 or so would have a very high resolution - before it's been processed, probably around 15k (though we'd never see that). After it's been processed, the original negative could have up to 10k resolution. This is because there are many more film grains, that are much smaller (this also results in a very non-grainy image). The only problem with this is that you can't use this type of film for everything - you need a LOT of light to expose it.
Now, higher ASAs, like, say, 200 or 300, require less light, and are the most commonly used. These films have fewer film grains, but they're larger, so the image appears grainier. Pre-processing, these would have roughly 12k or so resolution; post-processing, the original negative would ballpark around 8 or 9k. These films are more forgiving as far as light goes, but still can't shoot in very low-light situations.
Now, the much higher ASAs, like 500, 700, or higher, can work in extremely low light. They have even fewer grains than 200 or 300 stock, but they're even larger, so the image appears very grainy. Pre-processing, these would have around 10k at the absolute maximum (probably closer to 8 or 9k). After processing, the negative would have around a 5 or 6k resolution.
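To put all three stock ranges in one place, here's my ballpark table as a tiny Python sketch (the numbers are just my rough estimates from above, nothing official or measured):

```python
# A quick summary of my ballpark estimates above. These are rough
# guesses from experience, not measured values.
STOCK_ESTIMATES = {
    # ASA range: (pre-processing "resolution", processed-negative "resolution")
    "low (~64)":         ("~15k",   "~10k"),   # tiny, plentiful grain; needs tons of light
    "medium (200-300)":  ("~12k",   "~8-9k"),  # the most commonly used stocks
    "high (500 and up)": ("~8-10k", "~5-6k"),  # big grain; works in very low light
}

for stock, (pre, post) in STOCK_ESTIMATES.items():
    print(f"{stock}: {pre} pre-processing, {post} on the processed negative")
```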
Now, those figures are for original negatives that haven't been transferred to a positive. A release print of these (made with the traditional conforming process) would have about 3k-6k resolution. Today, though, many high-budget films scan the negative at 2k or 4k and then print THAT back to film, which means the maximum resolution of that film is forever capped at 2k or 4k, depending on how it was printed - even if you later re-scanned the new negative.
For example, the maximum resolution of "The Dark Knight" is 4k for the 35mm scenes and 8k for the IMAX scenes, because they were scanned at those resolutions, edited in Avid, then printed back to film at those same resolutions.
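One way to see why the scan caps everything: the finished print can never resolve more detail than the weakest step in the chain. Here's a toy sketch of that idea (the step names and numbers are made up for illustration, not real measurements):

```python
# Toy model: the finished print can't resolve more than the
# lowest-resolution step it passed through. Values are illustrative only.
def effective_resolution(pipeline):
    """The weakest link in the chain caps everything downstream."""
    return min(res_k for _step, res_k in pipeline)

traditional = [("original negative", 8), ("interpositive", 6),
               ("internegative", 5), ("release print", 4)]
digital_intermediate = [("original negative", 8), ("2k scan", 2),
                        ("film-out negative", 2), ("release print", 2)]

print(effective_resolution(traditional))           # 4 -> about 4k
print(effective_resolution(digital_intermediate))  # 2 -> capped at 2k forever
```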
Also note: a single film will use a variety of different film stocks and ASAs, so one film will not have the same perceived resolution for every shot or scene.
Also note that these are estimates based on my experience and what I've been taught in film school so far (sophomore year now), and that film does NOT have a "resolution" per se. The maximum "resolution" of film is determined by scanning it at higher and higher resolutions until there is, finally, no longer a difference. Since the vast majority of people can't see a difference between the same film scanned at 4k and at 6k, 4k is generally accepted as the "maximum" resolution of film.
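That "scan higher until nothing changes" test is really just a loop. Here's a sketch of the procedure; scan() and visible_difference() are hypothetical stand-ins for a real scanner and a real side-by-side viewing test, not actual APIs:

```python
# Sketch of the "maximum resolution" test described above. scan() and
# visible_difference() are hypothetical stand-ins, not real APIs.
def max_useful_resolution(frame, scan, visible_difference,
                          start_k=2, step_k=2):
    """Scan at higher and higher resolutions until viewers can no longer
    tell the latest scan from the previous one, then report that plateau."""
    res_k = start_k
    previous = scan(frame, res_k)
    while True:
        current = scan(frame, res_k + step_k)
        if not visible_difference(previous, current):
            return res_k  # scanning any higher adds nothing you can see
        res_k += step_k
        previous = current
```

By that test, most 35mm material plateaus around 4k: a 6k scan exists, but almost nobody can tell it apart from the 4k one.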
But again, film is an analog format, so it has no real "resolution." The closest thing would be the number of grains in a particular frame, since a grain is the nearest equivalent to a "pixel." But by the nature of the way film is exposed, some frames will have a vastly different number of grains than others, not to mention that actually counting the grains in a frame is pretty much impossible.
There - I hope that helped.
Oh, and about film grain - saying you're not a fan of film grain is like saying you're not a fan of pixels in HD video. The only difference is that pixels in HD are fixed and never change, while film grain comes from a chemical process and does change, even within the same shot.