I’m so content with 1080p
TV and movies I’m totally good with 1080p. If I want a cinematic experience, that’s what the cinema is for.
But since switching to PC and gaming in 4k everywhere I can, going back to 1080p feels like a night and day difference. Granted, that means I care about monitor resolution rather than TV resolution.
But as an aside, as a software engineer who works from home: crisp text, decent color gamut support, and good brightness in a bright room are all things that make your day a whole lot better when you stare at a computer screen for a large chunk of it.
I run a Plex server with really high quality 1080p and I’m completely satisfied with it. I don’t see a reason to use the extra storage on 4k
Nah, 4K, even 1080p upscaled to 4K, is significantly better than Full HD with a video projector.
I’m so content with 1080p
Me too. I don’t even need 60hz. I get motion sickness if a screen goes over 30hz. I guess I’m officially old.
I like 4k for documentaries and cinematic shows, but I’ll never watch something like TNG or Jessica Jones in 4k again. Takes all the magic away, feels like you’re standing next to the camera guy - suddenly I just see an actor in a room and the immersion is broken.
Sounds like you have motion smoothing on.
Resolution alone isn’t enough to fuck that up. I noticed it first when watching The Hobbit in cinemas at 48fps. It makes things that are real look very real, and unfortunately what was real was Martin Freeman wearing rubber feet.
unfortunately what was real was Martin Freeman wearing rubber feet.
🤣🤣🤣
Ok, good tip. I’ll try that out and see if I can enjoy it more.
I’ll take a pair of rubber feet too!
I’m content with 480. High quality isn’t important for me. I still listen to MP3s that I got 25+ years ago.
Nothing is released in 8k, so why would someone want a format that nothing is released in?
Computer monitor with multiple simultaneous 4k displays?
Grasping at straws here
Nothing is produced in 8K either.
I am a filmmaker and have shot in 6k+ resolution since 2018. The extra pixels are great for the filmmaking side. Pixel binning when stepping down resolutions gives you better noise performance, color reproduction, and sharper detail, and it’s great for re-framing/cropping. 99% of my clients still want their stuff in 1080p! I barely even feel the urge to jump up to 4k unless the quality of the project somehow justifies it. Images have gotten to a good place. More detail won’t provide much more for human enjoyment. I hope they continue to focus on dynamic range, HDR, color accuracy, motion clarity, efficiency, etc. I won’t say no when we step up to 8k as an industry, but computing as a whole is not close yet.
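If it helps to see why binning cleans up noise, here’s a toy NumPy sketch (purely illustrative: made-up frame size, flat gray image, Gaussian noise; real sensors bin in hardware over a Bayer mosaic, so this is just the statistics):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "6K-ish" frame: a flat mid-gray image plus random sensor noise.
frame = 0.5 + rng.normal(0.0, 0.05, size=(3160, 6144))

# 2x2 binning: average each 2x2 block, halving the resolution on each axis.
binned = frame.reshape(1580, 2, 3072, 2).mean(axis=(1, 3))

print(frame.std())   # ~0.05   noise std dev at full resolution
print(binned.std())  # ~0.025  averaging 4 samples cuts the noise std dev in half
```

Same idea when a 6k capture gets delivered at 1080p: every output pixel is an average of a bunch of sensor pixels, so the noise drops and fine detail holds up better.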
The same argument goes for audio too.
6K and 8K are great for editing, just like how 96 kHz / 32-bit and above is great for editing. But it’s meaningless for watching and listening (especially for audio, where you can’t hear the difference above 44.1 kHz / 16-bit). When editing you’ll often stack up small artifacts, which can be audible or visible if you edit at the final resolution but are easy to smooth over if you edit at a higher one.
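For the audio side, the “edit high, deliver low” benefit is easy to demo with a nonlinear edit like clipping. A rough SciPy sketch (the tone frequency, gain, and oversampling factor are just numbers I picked):

```python
import numpy as np
from scipy.signal import resample_poly

fs = 44_100
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 9_000 * t)      # 9 kHz test tone, one second

# Hard clipping is a nonlinear "edit" that generates harmonics above Nyquist.
clip = lambda x: np.clip(3 * x, -1.0, 1.0)

# Edited directly at 44.1 kHz: those harmonics alias back into the audible band.
direct = clip(tone)

# Edited at 4x oversampling, then resampled back down: the harmonics get
# filtered out on the way down instead of folding back as aliases.
oversampled = resample_poly(clip(resample_poly(tone, 4, 1)), 1, 4)

def level_db(x, f_hz):
    spec = np.abs(np.fft.rfft(x)) / len(x)
    return 20 * np.log10(spec[round(f_hz * len(x) / fs)] + 1e-12)

# The 5th harmonic of 9 kHz (45 kHz) folds back to ~900 Hz at 44.1 kHz.
print(level_db(direct, 900))       # clearly present aliased component
print(level_db(oversampled, 900))  # way down in the noise floor
```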
Imagine you’re finishing in 8k and want to shoot at a higher resolution to give yourself some options for reframing and cropping. I don’t think Red, Arri, or Panavision even makes a cinema camera with a resolution over 8k. I think Arri is still 4k max. You’d pretty much be limited to Blackmagic cameras for 12k production today.
Plus the storage requirements for keeping raw footage in redundancy. Easy enough for a studio, but we’re YEARS from 8k being a practical resolution for most filmmakers.
My guess is most of the early consumer 8k content will be really shoddy AI upscaled content that can be rushed to market from film scans.
Film scanning at 4k already reveals the grain structure of the film; at 8k it’s going to become hard to ignore. And you’re spot on - they’ll do crappy 8k upres garbage for ages before the storage and streaming become practical.
There is also a 17k blackmagic coming out! The high resolution sensors they use aren’t a standard RGB pixel layout though so it’s not a great direct comparison. Like you said though, there’s no pipeline or good workflow for 8k in the slightest. Will take years if the industry decides to push for it
The extra pixels are great for the filmmaking side.
For what content? Video gaming (GPUs) has barely gotten to 4k. Movies? 4k streaming is a joke; better off with 1080 BD. If you care about quality go physical… UHD BD is hard to find and you have to wait and hunt to get them at reasonable prices… And these days there are only a couple UHD BD Player mfg left.
It’s such a shame that UHD isn’t easier to find. Even the ones you can find are poorly mastered half the time. But a good UHD on an OLED is chef’s kiss, just about the closest you can get to having a 35mm reel/projector at home.
You are absolutely on point with 4k streaming being a joke. Most 4k streams are 8-20 Mbps. A UHD runs at 128 Mbps.
Most 4k streams are 8-20 Mbps. A UHD runs at 128 Mbps.
Bitrate is only one variable in overall perceived quality. There are all sorts of tricks that can significantly reduce file size (and thus bitrate of a stream) without a perceptible loss of quality. And somewhat counterintuitively, the compression tricks work a lot better on higher resolution source video, which is why each quadrupling in pixels (doubling height and width) doesn’t quadruple file size.
The codec matters (h.264 vs h.265/HEVC vs VP9 vs AV1), and so do the settings actually used to encode. Netflix famously is willing to spend a lot more computational power on encoding, because they have a relatively small number of videos and many, many users watching the same videos. In contrast, YouTube and Facebook don’t even bother re-encoding into a more efficient codec like AV1 until a video gets enough views that they think they can make up the cost of additional processing with the savings of lower bandwidth.
Video encoding is a very complex topic, and simple bitrate comparisons only barely scratch the surface in perceived quality.
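One way to see how far apart those numbers really are is bits-per-pixel, which is roughly the budget the encoder has to spend (back-of-the-envelope sketch; the 24 fps and the example bitrates are just placeholders):

```python
def bits_per_pixel(mbps: float, width: int, height: int, fps: float = 24.0) -> float:
    """Average bits the encoder gets to spend on each pixel of each frame."""
    return mbps * 1_000_000 / (width * height * fps)

# Typical 4K stream vs. a UHD Blu-ray, both 3840x2160 @ 24 fps:
print(round(bits_per_pixel(15, 3840, 2160), 3))   # ~0.075 bits/pixel (streaming)
print(round(bits_per_pixel(128, 3840, 2160), 3))  # ~0.643 bits/pixel (UHD BD)
```

A better codec and slower encoding settings can claw back a chunk of that gap, but not all of it, which is roughly why both camps above have a point.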
It’s because for the Average Joe, having a TV box at the end of your driveway that has the latest big number on it is important. It’s how they gain their identity. Do not upset them for obvious reasons.
As someone who stupidly spent the last 20 or so years chasing the bleeding edge of TVs and A/V equipment, GOOD.
High end A/V is an absolute shitshow. No matter how much you spend on a TV, receiver, or projector, it will always have some stupid gotcha, terrible software, ad-laden interface, HDMI handshaking issue, HDR color problem, HFR sync problem or CEC fight. Every new standard (HDR10 vs HDR10+, Dolby Vision vs Dolby Vision 2) inherently comes with its own set of problems and issues and its own set of “time to get a new HDMI cable that looks exactly like the old one but works differently, if it works as advertised at all”.
I miss the 90s when the answer was “buy big chonky square CRT, plug in with component cables, be happy”.
Now you can buy a $15,000 4k VRR/HFR HDR TV, an $8,000 4k VRR/HFR/HDR receiver, and still somehow have them fight with each other all the fucking time and never work.
8K was a solution in search of a problem. Even when I was 20 and still had good eyesight, sitting 6 inches from a 90 inch TV I’m certain the difference between 4k and 8k would be barely noticeable.
The consumer has spoken and they don’t care, not even for 4K. Same as happened with 3D and curved TVs, 8K is a solution looking for a problem so that more TVs get sold.
In terms of physical media - at stores in Australia the 4K section for Blurays takes up a single rack of shelves. Standard Blurays and DVDs take up about 20.
Even DVDs still sell well because many consumers don’t see a big difference in quality, and certainly not enough to justify the added cost of Bluray, let alone 4K editions. A current example: Superman is $20 on DVD, $30 on Bluray (a 50% cost increase) or $40 on 4K (a 100% cost increase). Streaming services have similar pricing curves for increased fidelity.
It sucks for fans of high res, but it’s the reality of the market. 4K will be more popular in the future if and when it becomes cheaper, and until then nobody (figuratively) will give a hoot about 8K.
Some of the smaller 4k sets work as an XXL computer monitor
But for a living room tv, you seriously need space for a 120"+ set to actually see any benefit of 8k. Most people don’t even have the physical space for that
Article took forever to get to the bottom line: content. 8k content essentially does not exist. TV manufacturers were putting the cart before the horse.
4k TVs existed before the content existed. I think the larger issue is that the difference between what is and what could be is not worth the additional expense, especially at a time when most people struggle to pay for rent, food, and medicine. More people watch videos on their phones than watch broadcast television. 8k is a solution looking for a problem.
Hell, I still don’t own a 4k TV and don’t plan to go out of my way to buy one unless the need arises. And I don’t see why I’d need one when a normal flat-screen looks fine to me.
I actually have some tube TVs and have been thinking of just hooking my VCR back up and watching old tapes. I don’t need fancy resolutions in my shows or movies.
Only time I even think of those things is with video games.
4K hardly even makes sense unless your TV is over 70" and you’re watching it from less than 4 feet away. I do think VR could benefit from ultra-high resolution, though.
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship
Extensive write up on this whole issue, even includes a calculator tool.
But, basically:
Yeah, going by angular resolution, even leaving the 8K content drought aside…
8K might make sense for a computer monitor you sit about 2 feet / 0.6m away from, if the diagonal size is 35 inches / ~89cm, or greater.
Take your viewing distance up to 8 feet / 2.4m away?
Your screen diagonal now has to be about 125 inches / ~318cm, or larger, for you to be able to maybe notice a difference with a jump from 4K to 8K.
…
The largest 8K TV that I can see available for purchase anywhere near myself… that costs ~$5,000 USD… is 85 inches.
I see a single one of 98 inches that is listed for $35,000. That’s the largest one I can see, but it’s… uh, wildly more expensive.
So with a $5,000, 85 inch TV, that works out to…
You would have to be sitting closer than about 5 feet / ~1.5 meters to notice a difference.
And that’s assuming you have 20/20 vision.
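If anyone wants to check that figure, the usual rule of thumb is that 20/20 vision resolves about 1 arcminute. Quick Python sketch (16:9 panel assumed, function name is mine):

```python
import math

def max_useful_distance_m(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    """Distance beyond which a ~20/20 eye (about 1 arcminute of resolving
    power) can no longer separate adjacent pixels on a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pixel_pitch_m = (width_in / horizontal_px) * 0.0254
    return pixel_pitch_m / math.tan(math.radians(acuity_arcmin / 60))

print(round(max_useful_distance_m(85, 3840), 2))  # ~1.68 m: past this, 4K pixels already blur together
print(round(max_useful_distance_m(85, 7680), 2))  # ~0.84 m: where 8K could even start to matter
```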
…
So yeah, VR goggle displays… seem to me to be the only really possibly practical use case for 8K … other than basically being the kind of person who owns a home with a dedicated theater room.
What this chart is missing is the impact of the quality of the screen and the source material being played on it.
A shit screen is a shit screen, just like a badly filmed TV show from the 80s will look like crap on anything other than an old CRT.
People buying a 4k screen from Walmart for $200 then wondering why they can’t tell it’s any better than their old 1080p screen.
The problem with pushing up resolution is that the cost to get a good set right now is so high that it’s a niche within a niche of people who actually want it. Even a good 4k set with proper HDR support that’s big enough to make a difference is expensive. Even when 8k moves away from early-adopter markups it’s still going to be expensive, especially compared to the tat you can buy at the supermarket.
It is totally true that things are even more complex than just resolution, but that is why I linked the much more exhaustive write up.
It’s even more complicated in practice than all the things they bring up; they’re focusing mainly on a movie-watching experience, not a video-game-playing experience.
They don’t go into LED vs QLED vs OLED vs other actual display techs, don’t go into response latency, refresh rates, or, as you say, all the different kinds of HDR and color gamut support… I am sure I am forgetting things…
Power consumption may be a significant thing for you, image quality at various viewing angles…
Oh right, FreeSync vs GSync, VRR… blargh there are so many fucking things that can be different about displays…
You’re describing my bedroom tv.
you’re*
It’s not hard, get it right.
Nobody likes a grammar-nazi. Due better mein fuhrer.
gidoombiigiz*
I think it’s NHK, or one of the Japanese broadcasters anyways, that has actually been pressing for 8K since the 1990s. They didn’t have content back then and I doubt they have much today, but that’s what they wanted HD to be.
Not familiar with NHK specifically (or, to be clear, I think I am but not with enough certainty), but it really makes a lot of sense for news networks to push for 8k or even 16k at this point.
Because it is a chicken and egg thing. Nobody is going to buy an 8k TV if all the things they watch are 1440p. But, similarly, there aren’t going to be widespread 8k releases if everyone is watching on 1440p screens and so forth.
But what that ALSO means is that there is no reason to justify using 8k cameras if the best you can hope for is a premium 4k stream of a sporting event. And news outlets are fairly regularly the only source of video evidence of literally historic events.
From a much more banal perspective, it’s why there is a gap in TV/film where you go from 1080p or even 4k re-releases, to increasingly shady upscaling of 720p or even 480p content, back to everything being natively 4k. Oversimplifying, it’s because we were shooting on MUCH higher quality cameras than we really “should” have been for so long, before switching to cheaper film and then outright digital sensors because “there is no point”. Obviously this ALSO depends on the high resolution originals being saved but… yeah.
It’s not exactly “there is no point”. It’s more like “the incremental benefit of filming and broadcasting in 8k does not justify the large cost difference”.
Filming in 8k does have advantages. You can crop without losing quality.
I’m sorry, but if we are talking about 8k viability in TVs, we are not talking about shooting in 8k for 4k delivery.
You should be pointing out that shooting in higher than 8k, so you have the freedom to crop in post, is part of the reason 8k is burdensome and expensive.
So correct the person above me, they wrote about shooting in 8k.
The RED V-Raptor is expensive for consumer grade but nothing compared to some film equipment. There are lenses more expensive than an 8k camera.
Which, for all intents and purposes, means there is no point. Because no news network is going to respond to “Hey boss, I want us to buy a bunch of really expensive cameras that our audience will never notice because it will make our tape library more valuable. Oh, not to sell, but to donate to museums.” with anything other than laughter and MAYBE firing your ass.
the point is, the cost/benefit calculation will change over time as the price of everything goes down. It’s not a forever “no point”.
… Almost like it would be more viable to film in higher resolution if more consumers had higher resolution displays?
Not only does the content not exist yet, it’s just not practical. Even now 4k broadcasting is rare and 4k streaming is a premium add-on (and not always with a good bitrate, which matters a lot more) when it was once offered at no extra cost. Imagine 8k, which would roughly quadruple the amount of data required to transmit it (and transmission speed doesn’t scale linearly with cost; 4x the speed would probably be at least 8x the cost).
And I seriously think no one except the nerdiest of nerds would notice a difference between 4k and 8k.
That’s usually the case
Not only does it not exist, it isn’t wanted. People are content watching videos on YouTube and Netflix. They don’t care about 4k. Even if they pay extra for Netflix 4k (which I highly doubt they do), I still question whether they’re actually getting 4k given their bandwidth and other limiting factors, which means they’re not watching 4k and are fine with it.
TV manufacturers are idiots.
I don’t care about 8k.
I just want an affordable dumb TV. No on-board apps whatsoever. No smart anything. No Ethernet port, no WiFi. I have my own stuff to plug into HDMI already.
I’m aware of commercial displays. It just sucks that I have to pay way more to have fewer features now.
I blacklist the TV’s Ethernet and WiFi MAC addresses. I strongly encourage using a computer, Apple TV, or anything that can’t fingerprint everything you use your TV for.
This. I’ll happily buy an 8k TV only if it’s a dumb TV/monitor.
No, I want only one DP port and to have a separate box that selects sources. That way I have the ports I want
I do want a dumb 8K TV. I do not want all the so called smart features of a TV. Small Linux device with kodi works way better.
deleted by creator
Some Xiaomi TVs have root exploits, so you can manually disinfect the OS, but it’s cumbersome to get done since you need to enter adb commands over the remote control to get there in the first place.
Easier to just use an external device and the TV as a screen only. Personally I’ve been using the Nvidia Shield for 5+ years now and regret nothing.
Not ideal, but you can air gap the TV from the network, and use some small sbc, or even a firestick or android box. That’s what I do. Stremio?
As far as my TV is concerned I don’t have an internet connection.
I do want a TV that can access Netflix etc without another box. I just don’t want the surveillance that comes with it.
I just run mine without ever connecting it to the internet.
I run an Apple TV (shock, walled garden!), as it is the only device I’ve seen that consistently matches frame rates properly on the output. I personally hate the Kodi UI. But I get your point.
uh…there are hundreds of Kodi UIs.
The difference between 1080 and 4K is pretty visible, but the difference between 4K and 8K, especially from across a room, is so negligible that it might as well be placebo.
Also the fact that 8K content takes up a fuckload more storage space. So, there’s that, too.
solution: 16K 3D TV. buy now.
I only want the curved IMAX version, though
I just want a projector at that point, seems we are destined to bounce around aspect ratios and that might help the black bar situation.
if it’s a 16’ x 9’ screen, I’m in.
Even 1080p isn’t hugely different from 4k in many cases. Yeah, you can probably notice it, but both are fantastic resolutions. I’ve had a 4k TV for years, and I can count the number of times I’ve actually watched 4k content on it on two hands because it generally isn’t worth the storage space or extra cost.
I find that it really depends on the content and the size of the display.
The larger the display, the more you’d benefit from having a higher resolution.
For instance, a good quality 1080p stream vs a highly compressed 4k stream probably won’t look much different. But a “raw” 4k stream looks incredible… think of the demos you see in stores showing off 4k TVs… that quality is noticeable.
Put the same content on a 50"+ screen, and you’ll see the difference.
When I had Netflix, watching in 4k was great, but to me, having HDR is “better”.
On a computer monitor, there’s a case for high-resolution displays because they allow you to fit more on the screen without making the content look blurry. But on a TV, 4k + HDR is pretty much peak viewing for most people.
That’s not to say that if you create content, 8k is useless. It can be really handy when cropping or re-framing if needed, assuming the desired output is less than 8k.
think of the demos you see in stores showing off 4k TVs… that quality is noticeable.
Sure. But remember that much of the time, the content is tuned for what the display is good at, which won’t necessarily reflect what you want to watch on it (i.e. they’re often bright colors with frequent color changes, whereas many movies are dark with many slow parts). At least at the start, many 4k TVs had a worse picture than higher end 1080p TVs, and that’s before HDR was really a thing.
So yeah, it highly depends on the content. As you mentioned, in many cases, 1080p HDR will be better than 4k non-HDR. Obviously 4k HDR on a good display is better than 1080p HDR on a good display, but the difference is much less than many people claim it to be, especially at a typical TV viewing distance (in our case, 10-15 ft/3-5m).
computer monitor
I find the sweet spot to be 1440p. 4k is nicer, but the improvement over 1440p is much less than 1440p vs 1080p. My desktop monitor is a 27" 1440p monitor w/ approx 109 ppi, and my work laptop is a Macbook Pro w/ 3024x1964 resolution w/ approx 254 ppi, more than double. And honestly, they’re comparable. Text and whatnot is certainly sharper on the nicer display, but there are certainly diminishing returns.
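(For anyone double-checking those ppi figures, it’s just diagonal pixel count over diagonal inches. Rough sketch below; the 14.2" size for that MacBook Pro is my assumption.)

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels along the diagonal divided by the diagonal length in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27)))    # ~109  (27" 1440p)
print(round(ppi(3024, 1964, 14.2)))  # ~254  (assuming the 14.2" MacBook Pro)
print(round(ppi(3840, 2160, 27)))    # ~163  (27" 4K, for comparison)
```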
That said, if I were to watch movies frequently on my computer, I’d prefer a larger 4k monitor so 1080p content upscales better. But for games and normal computer stuff, 1440p is plenty.
Given that I don’t find a ton of value in 4k over 1080p, 8k will be even more underwhelming.
think of the demos you see in stores showing off 4k TVs… that quality is noticeable.
Because stores use a high quality feed and force you to stand within 4 ft of the display. There is a whole science to how Best Buy manipulates TV sales. They will not let you adjust the picture settings on lower-margin TVs.
Because stores use a high quality feed
Yes, obviously, and consumers who are buying such high-end displays should do their best to provide the highest quality source to play back on those displays.
Distance from the display is important, too. On a small TV, you’ll be close to it, but resolution won’t matter as much.
But from across the room, you want a higher resolution display up to a certain point, or else you’ll see large pixels, and that looks terrible.
Personally, going with a 4k TV was a big leap, but the addition of HDR and an OLED display (for black blacks) had the most impact.
depends how far you are from the screen.
True. Our TV is 10-15 ft/3-5m away on a ~60in screen, and at that distance, the difference is noticeable, but not significant. We have a 40" screen with a much closer viewing distance (usually 5-8 ft/~2m), and we definitely notice the difference there.
If I was watching movies at a desk w/ a computer monitor, I’d certainly notice 1080p vs 4k, provided the screen is large enough. In our living room with the couch much further from the screen, the difference is much less important.
I don’t want 8K. I want my current 4K streaming to have less pixelation. I want my sound to be less compressed. Make them closer to Ultra HD Blu-ray disc quality before forcing 8K down our throats… unless doing that gives us better 4K overall.
Yeah 4K means jack if it’s compressed to hell, if you end up with pixels being repeated 4x to save on storage and bandwidth, you’ve effectively just recreated 1080p without upscaling.
Just like internet. I’d rather have guaranteed latency than 5Gbps.
Yep, just imagine how bad the compression artefacts will be if they double the resolution but keep storage/network costs the same.
Doubling the dimensions makes it 4x the data.
That’s not true for compressed video. It doubles the bitrate for the same quality on modern codecs (265, av1, etc.)
Not if you only double it in one direction. Checkmate.
Increasing resolution but keeping the same bitrate still improves the image quality, unless the bitrate was extremely low in the first place. Especially with modern codecs
20mbps 4k looks a lot better than 20mbps 1080p with AV1
Bingo. If I were still collecting DVDs/HD DVDs like I was in the 90’s, it might be an issue. Streaming services and other online media routed through the TV can hardly buffer fast enough to keep up with playback at 720p, so what the fuck would I want with a TV that can show a higher quality picture, which it also can’t display without stutter-buffering through the whole of a 1:30:00 movie?
I would much rather have 1080p content at a high enough bitrate that compression artifacts are not noticeable.
Yeah, as long as they don’t discontinue them.
Here in Australia, they are almost gone. Disney doesn’t release them anymore and other studios only release the biggest of titles; smaller movies get fewer and fewer releases. Some TV shows only get DVD. It’s got me importing discs for things I really want and grabbing a lot of stuff from the high seas.
Even if they discontinue Bluray, 4k content isn’t going anywhere, and neither are 4k TVs.
The 4k you find on streaming services can’t really be compared to the 4k you find on Blu-ray. It’s a different league. Turns out bitrate actually matters
Pretty sure my eyes max out at 4K. I can barely tell the difference between 4K and 1080P from my couch.
Try BD vs UHD BD on a modern movie. No Country for Old Men for example. Hugely noticeable.
Yeah. Another one for me was Deadpool, because the texture of his outfit actually feels real on the 4K disc in a way that it doesn’t in HD.
Whenever I see people point at math equations “proving” that it’s impossible to tell the difference from a comfortable viewing distance, I think of Deadpool’s contours.
Can I identify the individual pixels in HD? Nope. Does it make a difference? Yes definitely.
I’ll add that to the list. Thanks!
If you can’t notice it when you’re not comparing side by side it doesn’t count
I never said side by side. The UHD is noticeable without that.
HDR is more noticeable, but yeah, I don’t care if it’s 1080p or 4k.
I watch torrented shows with VLC on my laptop. Why would I want a giant smartphone that spies on me?
I hate the wording of the headline, because it makes it sound like the consumers’ fault that the industry isn’t delivering on something they promised. It’s like marketing a fusion-powered sex robot that’s missing the power core, and turning around and saying “nobody wants fusion-powered sex robots”.
Side note, I’d like for people to stop insisting that 60fps looks “cheap”, so that we can start getting good 60fps content. Heck, at this stage I’d be willing to compromise at 48fps if it gets more directors on board. We’ve got the camera sensor technology in 2025 for this to work in the same lighting that we used to need for 24fps, so that excuse has flown.
The only complaints I’ve ever heard about 60fps are from gamers who prefer higher refresh rates. Does anyone advocate for framerates to be lower than 60??
Yes, movie people complain that more than 24 fps looks like soap operas (because digital TV studio cameras moved to 60 fps first).
Yeah, also as I alluded to earlier if you shoot at 60fps you get a shorter max exposure time per frame, which can translate to needing more light, which in turn leads to the studio lighting soap opera feel. But that was more of a limitation 15 years ago than it is now.