I think CatBus has covered all of the details on this, so I’m not going to repeat what he has said.
Yes, a 70mm blow-up will probably be of better quality than a 35mm release print, but given that 4K is already arguably overkill for the quality of materials available outside the OCN, any difference will be very hard to see, even on an 8K TV. Don’t forget it is an optical enlargement: while it may improve how grain and detail come across, it is still a generation away from the source.
CatBus’ example, comparing 4K83 against Harmy’s Despecialised (which uses the official 4K scan of the OCN), shows the quality difference that even the best prints (in this case a very high-quality ‘Show Print’) are up against. I am by no means an expert on these things, but the difference between a 35mm Show Print and a 70mm Blow Up will still be minimal at best.
Most Hollywood films today are still only scanned at either 4K or perhaps 6.5K for super-sampling down to a 4K Master.
2001 was shot on 70mm, as was ‘My Fair Lady’, which also got an 8K restoration. It makes sense to scan native 70mm OCN at 8K+ because the frame is larger: a scan is essentially a fixed dpi, so, much as with 3D printing, if your input frame is larger, your scanning resolution must be higher to maintain the same dpi. It also gives you the option to super-sample the image down after acquisition.
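To put rough numbers on that dpi point, here’s a minimal sketch; the gate widths are my approximations, not figures from this thread:

```python
# Sampling density (pixels per mm of film) for different scan resolutions.
# Frame widths below are approximate camera gate sizes -- my assumptions.
FRAME_WIDTH_MM = {
    "35mm full aperture": 24.9,   # ~24.89mm
    "65mm 5-perf":        52.5,   # ~52.48mm (70mm prints carry the same image)
}
SCAN_WIDTH_PX = {"2K": 2048, "4K": 4096, "8K": 8192}

for film, width_mm in FRAME_WIDTH_MM.items():
    for scan, px in SCAN_WIDTH_PX.items():
        print(f"{film} @ {scan}: {px / width_mm:6.0f} px/mm")

# An 8K scan of a 65mm frame (~156 px/mm) samples the film no more densely
# than a 4K scan of a 35mm frame (~164 px/mm) -- the bigger negative is
# what justifies the bigger scan.
```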
The quality difference between a 4K scan of a 35mm print or 70mm blow-up vs an 8K scan of the same print will be marginal at best, and probably ‘invisible’ to the human eye by the time it’s compressed for home viewing.
An 8K 16-bit DPX file sequence clocks in at 4.756GB/s, or 16.69TB/hr, so you’re looking at in the region of 33.37TB for Star Wars alone. Once you’ve done your restoration work, which will entail at least a second copy of this data (source and destination), you could generate an Apple ProRes 4444 XQ 12-bit master, which is ‘only’ 810MB/s, or 2.78TB/hr.
Compare that to 4K 16-bit DPX, where your data rate is only 1.19GB/s, or 4.17TB/hr, and the resulting ProRes is 202.50MB/s, or 711.91GB/hr. I honestly don’t think the quality difference would justify the 4x increase in data required.
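If you want to sanity-check those figures, here’s a quick back-of-the-envelope calculation; I’m assuming full-width DCI-style frames, RGB at 16 bits per channel, 24fps, binary units (1GB = 1024^3 bytes), and ignoring DPX header overhead:

```python
GIB, TIB = 1024**3, 1024**4

def dpx_rates(width, height, fps=24, bits=16, channels=3):
    # Uncompressed bytes per frame, then scaled to per-second and per-hour.
    bytes_per_frame = width * height * channels * bits // 8
    per_sec = bytes_per_frame * fps
    return per_sec / GIB, per_sec * 3600 / TIB  # (GB/s, TB/hr)

for label, (w, h) in {"8K": (8192, 4320), "4K": (4096, 2160)}.items():
    gb_s, tb_hr = dpx_rates(w, h)
    print(f"{label}: {gb_s:.2f}GB/s, {tb_hr:.2f}TB/hr")

# Prints roughly: 8K: 4.75GB/s, 16.69TB/hr (~33.4TB for a two-hour film)
#                 4K: 1.19GB/s,  4.17TB/hr (exactly a quarter of the 8K data)
```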
And at what bitrate are you going to play this footage back on your 8K TV? Encoding is inherently lossy if you are looking to reduce file size by a meaningful amount, and the only way to compress something that large would be to throw away the ‘extra’ detail you’ve potentially captured in the first place.
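Just to illustrate the scale of that squeeze (the 100Mbps delivery bitrate below is my assumption, roughly top-end UHD Blu-ray territory; 8K streaming is typically advertised lower, so treat it as generous):

```python
# How much further the 8K ProRes above would need to be compressed for
# home delivery. The 100Mbps figure is an assumption, not from this thread.
prores_bytes_per_sec = 810 * 1024**2          # the 810MB/s figure above
prores_mbps = prores_bytes_per_sec * 8 / 1e6  # ~6,795Mbps
delivery_mbps = 100                           # hypothetical delivery bitrate
print(f"~{prores_mbps / delivery_mbps:.0f}:1 on top of ProRes's own compression")
# prints "~68:1 on top of ProRes's own compression"
```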
Most films of the DI era, roughly the last two decades, have been finished as a 2K DI, and are therefore upscaled to 4K for UHD Blu-ray etc. Most studios only started seriously doing 4K DIs in the last decade, and later still for visual-effects-heavy films. Marvel films in particular were 2K DIs until fairly recently; in fact, ‘The Marvels’ was still a 2K DI according to IMDb.
Outside a few edge cases and live sport, 8K will remain something that content owners upscale to, or leave to the device to handle, as CatBus suggests. Far easier and cheaper to improve the upscale quality in the final device than to bloat the entire workflow for a minute portion of the market (“According to Omdia, shipments of 8K TVs only accounted for 0.15% of all TV shipments in 2021. This translated to a little more than 350,000 units globally.”).
Out of interest, what ‘native’ 8K content will you be testing on your new screen?