The whole resolution debate is a bit of a smoke-and-mirrors thing, because film has no fixed resolution, and resolution is not what we should be measuring anyway. The real measurement is resolving power, that is, how many lines of detail can actually be distinguished on screen. You can have a "2K" image that looks like shit and isn't any more detailed than a really high-quality 16mm negative, and you can have an "HD" image that blows away a 35mm original camera negative. I know, because I've seen both. So saying 4K > 2K > HD doesn't necessarily tell you anything; it's just a count of the fixed number of pixels the image is composed of. The fact that film is not a digital image drives this home: there is nothing to count. Instead you measure the results, and there is no standard result for each format. In the case of 35mm, it depends mainly on the film stock and lenses (and to some small degree on the aperture, whether filters were used, etc.). In the case of HD (or 4K, etc.), it depends on the camera sensor and electronics and the lens (and again, to some degree, on the aperture, filters, gain, and other small things, maybe even settings like gamma preferences). So you just have to examine it on a case-by-case basis and judge by eye which one looks more detailed or has more picture information.
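If you want a rough sense of how resolving power maps onto pixel counts, here's a quick back-of-the-envelope sketch (Python). It leans on the usual rule of thumb that one line pair takes about two pixels to resolve; the lp/mm and gate-width figures in it are illustrative assumptions on my part, not measurements of any particular stock, lens, or scan.

```python
# Rough back-of-the-envelope: convert resolving power (line pairs per mm,
# as measured on the final image) into an "equivalent" horizontal pixel
# count for a given film gate width. One line pair needs roughly two
# pixels to resolve, hence the factor of 2.

def equivalent_horizontal_pixels(lp_per_mm: float, gate_width_mm: float) -> int:
    """Approximate horizontal pixel count needed to hold the measured detail."""
    return round(2 * lp_per_mm * gate_width_mm)

# Illustrative only: assume a very sharp 16mm negative delivers ~80 lp/mm
# end-to-end (stock + lens combined), over a ~10.26mm-wide 16mm camera
# aperture. That works out to roughly 1,600 pixel-equivalents -- in the
# same ballpark as a mediocre "2K" (2048-wide) image.
print(equivalent_horizontal_pixels(80, 10.26))  # -> 1642
```

The point of the sketch is just that detail is a property of the whole chain (stock, lens, generation losses), and the pixel count of a scan or a sensor only tells you the ceiling, not what actually got captured.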
Generally speaking, though, most typical 35mm negatives are roughly comparable to a typical 4K image, but that's a fast-and-loose rule. And, generally speaking, most typical 35mm positive release prints carry about the same detail as a medium-quality HD image, or slightly less. And because it was such a popular film, the Star Wars negative and interpositives got dirty real quick, and a movie house would play its copy over and over again, so by the time a lot of people saw it for the second or third time, it probably would have been beaten by a 720p image in terms of clarity and detail.
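Plugging similarly rough numbers into the same sketch shows where those rules of thumb come from. Again, the lp/mm values and the ~24.89mm full camera aperture width are assumed ballparks for a generic 35mm example, not measurements of any real negative or print.

```python
# Using equivalent_horizontal_pixels() from the sketch above.
FULL_APERTURE_WIDTH_MM = 24.89  # approx. 35mm full camera aperture width

# A sharp original negative: assume ~80 lp/mm end-to-end.
print(equivalent_horizontal_pixels(80, FULL_APERTURE_WIDTH_MM))  # -> 3982, roughly 4K

# A worn, multi-generation release print: assume ~25-30 lp/mm.
print(equivalent_horizontal_pixels(30, FULL_APERTURE_WIDTH_MM))  # -> 1493, below full HD
```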