James Cameron is full of shit. You can see the difference between a 35mm negative and 1920x800 with your eyes. What he's talking about is digital 4K camera capture, which is a different thing from scanning a film.
I saw Inception digitally a couple weeks ago and the picture quality was awful. Tons of artifacts, and a general softness. I forgot about it once the movie got going because it's an absorbing piece of cinema, but once you start examining the image you can see how weak it is. Of course projection is not the same as capture. Resolution isn't the be-all and end-all for film, otherwise everyone would be shooting on 65mm and IMAX. Especially when it comes to video: don't forget, DVD and VHS have roughly the same number of scan lines, but one looks like shit and the other can be projected in a small screening room at pretty good quality.
However, scanning a film from 35mm in 4K is a separate issue. You can shoot something in 1920x1080 today and have it look pretty good because camera sensors have improved. The prequels look like shit not just because of the resolution, but because of the sensors, which were 2/3" CCDs recording 4:2:2 colour. But when you start comparing scanning 35mm in HD versus 4K, it's a totally different game. An HD scan simply cannot retain the same amount of picture information that a 4K scan can, because there is more information on the negative than 1920x1080 is capable of holding. This talk about HD cameras versus 4K cameras is misleading because a telecine machine is not the same as a motion picture camera.
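To put rough numbers on the scan comparison, here's a quick back-of-the-envelope sketch. The HD and 4K figures are standard container sizes; the 35mm number is only a commonly cited ballpark estimate for a sharp negative, not a measurement, so treat it as an assumption:

```python
# Rough pixel-count comparison between an HD scan and a 4K film scan.
# The 35mm figure is a commonly cited ~6K-equivalent estimate for a
# sharp negative -- an assumption for illustration, not a measurement.
hd = 1920 * 1080         # full-frame HD scan
four_k = 4096 * 2160     # DCI 4K scan
neg_est = 6144 * 3456    # rough 6K-equivalent estimate for 35mm

print(f"HD scan:   {hd / 1e6:.1f} MP")
print(f"4K scan:   {four_k / 1e6:.1f} MP ({four_k / hd:.1f}x HD)")
print(f"35mm est.: {neg_est / 1e6:.1f} MP")
```

So even before you argue about lenses or grain, a 4K scan carries roughly four times the pixel information of an HD scan, and the negative plausibly holds more than either.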