This thread is intermingling projector, HDTV flat panel, and PC monitor users, and sources off disc, broadcast, and streaming. It's probably best to specify some details when relating our personal observations.

My experience tends toward Tyson's "it's not 4K that's the game changer, it's HDR." However, I will hasten to add that I have respect for OlesonMD's and Consumer Reports' tests. At many screen sizes and distances, when "identical" source material is used, it can be hard to spot the differences between 1080p and 4K when looking at the same movie. Where I fault the discussion is that "identical" is almost never that. To do a peer-reviewed, publication-quality test of the limits of human visual perception of resolution, one would need to control both the screen and the source: the projector or panel needs to be able to bypass all video processing, and the source needs to be an uncompressed 1080p or 4K feed. Out here in the real world, with the displays and sources available to us regular folks, every display has what used to be supercomputer processing power resampling and reformatting the image to its native pixels. And copies of the same movie on DVD, Blu-ray, or 4K HDR Blu-ray each have their own unique mastering history. So if I compare 'The 5th Element' off DVD vs. Blu-ray vs. Amazon 4K UHD streaming and say 'yes' or 'no' I see a difference, am I commenting on the format, the display-plus-signal chain, or the source material? I maintain it's a comment on all three of those elements.

My personal experience: a couple of months ago my beloved Panasonic 55" 1080p plasma died. I replaced it with a Sony 65" LED with full-array local dimming, 4K, and HDR. If I lived alone I would have gone OLED, but I was tired of nagging the family about screen burn-in. We are a streaming household and most content is still 1080p. I have not invested in a 4K disc player because we rent video content and do not buy it. When I get my videophile on and pull some 4K content off YouTube or Amazon, it looks stunning. My guess is half that stunning picture is due to my shiny new Sony and half is the 4K and 8K video cameras used to create it.

One area where the 4K HDR display surprises me is my digital photo collection. I have a media PC and set it to run slideshow screen savers. Occasionally a photo I know quite well will display with colors and details I was completely unaware existed in that image. That is the big reason I think HDR, and the display-panel engineering behind it, is the big deal here.
The real test would be a double-blind viewing. Have two displays, same model TVs, same program material, but one showing 4K UHD video and the other a 1080p Blu-ray. Sit 6' from your 55" screen and pick out the 4K.
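As a rough sanity check on that 6'/55" scenario, here is a back-of-the-envelope sketch in Python. It assumes a 16:9 panel and the commonly quoted figure of roughly 1 arcminute for normal visual acuity; the numbers are illustrative, not a substitute for the blind test.

```python
import math

def pixel_arcminutes(diagonal_in, horizontal_px, distance_in, aspect=16/9):
    """Angular size of one pixel, in arcminutes, for a 16:9 panel."""
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)  # screen width
    pixel_pitch_in = width_in / horizontal_px                   # one pixel's width
    return math.degrees(math.atan(pixel_pitch_in / distance_in)) * 60

# 55" panel viewed from 6 feet (72 inches)
print(f"1080p: {pixel_arcminutes(55, 1920, 72):.2f} arcmin per pixel")
print(f"4K:    {pixel_arcminutes(55, 3840, 72):.2f} arcmin per pixel")
# Prints roughly 1.2 arcmin for 1080p and 0.6 arcmin for 4K, so at this
# distance the 4K pixels sit at or below the ~1 arcmin acuity rule of thumb.
```

Which is exactly why a blind test at that distance would be interesting: the extra pixels are near the edge of what the eye can resolve, so any visible difference is likely coming from HDR, mastering, or processing rather than raw resolution.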
As is typical of most things audio-visual related, there are hidden factors that may play a bigger role in why 4K doesn't seem that much better than 1080p. One of those is that a lot of 4K source material was mastered in... 2K. That's right: your 4K UHD Blu-ray is probably a 2K master upscaled to 4K.

The fact is, most '4K' material you watch was edited and mastered in 2K, because doing all of that in 4K/6K/8K is still extremely expensive and hard on computers. That includes Marvel movies, etc. They shoot on hi-res cameras, but they edit using a 2K DI (digital intermediate) master for theatrical release, which is then upscaled for 4K Blu-rays and select 4K theaters.
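To make "upscaled to 4K" concrete, here is a minimal sketch using Pillow (just an illustration library, not anything the studios actually use; the filenames are hypothetical). The point is that interpolation adds pixels, not detail.

```python
from PIL import Image  # pip install Pillow

# A 2K DI frame is roughly 2048x1080; UHD delivery is 3840x2160.
di_frame = Image.open("frame_2k.png")  # hypothetical 2K source frame
uhd_frame = di_frame.resize((3840, 2160), Image.Resampling.LANCZOS)
uhd_frame.save("frame_uhd.png")
# The upscale fills in four times as many pixels by interpolation,
# but fine detail the 2K master never captured cannot be recovered.
```

Studio upscalers are far more sophisticated than a single Lanczos resize, but the principle is the same: the real detail ceiling is set by the 2K master, not by the "4K" label on the disc.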
I don't know if this is relevant to this discussion, but I just watched a DVD extra where the filmmaker said 4K was much better than 1080 HD because he could shoot a full scene, and if he later wanted a close-up, or needed to eliminate some unnecessary background or distraction at the edge of the frame, he could do it in editing by cropping in, without any loss of quality or added graininess. So 4K (or better), at least, is going to stay as storage gets cheaper.
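That editing trick is easy to see in numbers: a 3840x2160 frame contains a full 1920x1080 image inside any quarter of it, so a 2x "punch-in" is still pixel-for-pixel HD. A tiny sketch (again with Pillow, hypothetical filenames) of a centre crop:

```python
from PIL import Image  # pip install Pillow

uhd = Image.open("uhd_frame.png")  # hypothetical 3840x2160 frame
w, h = uhd.size
# Take the centre 1920x1080 region: a 2x close-up that is still a
# native full-HD image, with no interpolation or upscaling involved.
left, top = (w - 1920) // 2, (h - 1080) // 2
closeup = uhd.crop((left, top, left + 1920, top + 1080))
closeup.save("closeup_1080p.png")
```

That reframing-in-post workflow is a big part of why productions keep shooting above their delivery resolution.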