The article took forever to get to the bottom line: content. 8K content essentially does not exist. TV manufacturers were putting the cart before the horse.
4K TVs existed before the content did. I think the larger issue is that the difference between what is and what could be isn’t worth the additional expense, especially at a time when most people struggle to pay for rent, food, and medicine. More people watch videos on their phones than watch broadcast television. 8K is a solution looking for a problem.
Hell, I still don’t own a 4K TV and don’t plan to go out of my way to buy one unless the need arises. I don’t see why I’d need one when a normal flat-screen looks fine to me.
I actually have some tube TVs and have been thinking of just hooking my VCR back up and watching old tapes. I don’t need fancy resolutions in my shows or movies.
Only time I even think of those things is with video games.
4K hardly even makes sense unless your TV is over 70" and you’re watching it from less than 4 feet away. I do think VR could benefit from ultra-high resolution, though.
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship
Extensive write up on this whole issue, even includes a calculator tool.
But, basically:
Yeah, going by angular resolution, even leaving the 8K content drought aside…
8K might make sense for a computer monitor you sit about 2 feet / 0.6m away from, if the diagonal size is 35 inches / ~89cm, or greater.
Take your viewing distance up to 8 feet / 2.4m away?
Your screen diagonal now has to be about 125 inches / ~318cm, or larger, for you to be able to maybe notice a difference with a jump from 4K to 8K.
…
The largest 8K TV that I can see available for purchase anywhere near myself… that costs ~$5,000 USD… is 85 inches.
I see a single one of 98 inches that is listed for $35,000. That’s the largest one I can see, but it’s… uh, wildly more expensive.
So with a $5,000, 85 inch TV, that works out to…
You would have to be sitting closer than about 5 feet / ~1.5 meters to notice a difference.
And that’s assuming you have 20/20 vision.
…
So yeah, VR goggle displays… seem to me to be the only really possibly practical use case for 8K … other than basically being the kind of person who owns a home with a dedicated theater room.
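If anyone wants to sanity-check that figure, here’s a rough back-of-the-envelope in Python using the common one-arcminute-per-pixel rule of thumb for 20/20 vision. It’s a simplification (the rtings calculator linked above is more rigorous), but it lands in the same ballpark:

```python
# Rough check of the "when could 8K beat 4K" distance, using the rule of thumb
# that 20/20 vision resolves about one arcminute of visual angle.
import math

def max_useful_distance_inches(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance at which one pixel subtends one arcminute.
    Sit closer than this and a higher-resolution panel could show visible extra detail."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch = width_in / horizontal_px
    return pixel_pitch / math.tan(math.radians(1 / 60))

# 85" panel: how close do you have to sit before 4K's pixels become resolvable,
# i.e. before an 8K upgrade could even in principle show more detail?
d = max_useful_distance_inches(85, 3840)
print(f"{d:.0f} in ≈ {d / 12:.1f} ft ≈ {d * 0.0254:.1f} m")  # ~66 in ≈ 5.5 ft ≈ 1.7 m
```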
What this chart is missing is the impact of the quality of the screen and the source material being played on it.
A shit screen is a shit screen, just like a badly filmed TV show from the 80s will look like crap on anything other than an old CRT.
People buy a 4K screen from Walmart for $200, then wonder why they can’t tell it’s any better than their old 1080p screen.
The problem with pushing up resolution is that the cost to get a good set right now is so high that it’s a niche within a niche of people who actually want it. Even a good 4K set with proper HDR support, big enough to make a difference, is expensive. Even when 8K moves past the early-adopter markups it’s still going to be expensive, especially compared to the tat you can buy at the supermarket.
It is totally true that things are even more complex than just resolution, but that is why I linked the much more exhaustive write up.
It’s even more complicated in practice than all the things they bring up; they focus mainly on the movie-watching experience, not the game-playing experience.
They do not go into LED vs QLED vs OLED vs other actual display techs, response latency, refresh rates, or, as you say, all the different kinds of HDR and color gamut support… I am sure I am forgetting things…
Power consumption may be a significant thing for you, image quality at various viewing angles…
Oh right, FreeSync vs GSync, VRR… blargh there are so many fucking things that can be different about displays…
You’re describing my bedroom tv.
you’re*
It’s not hard, get it right.
Nobody likes a grammar-nazi. Due better mein fuhrer.
gidoombiigiz*
I think it’s NHK, or one of the Japanese broadcasters anyways, that has actually been pressing for 8K since the 1990s. They didn’t have content back then and I doubt they have much today, but that’s what they wanted HD to be.
Not familiar with NHK specifically (or, to be clear, I think I am but not with enough certainty), but it really makes a lot of sense for news networks to push for 8k or even 16k at this point.
Because it is a chicken and egg thing. Nobody is going to buy an 8k TV if all the things they watch are 1440p. But, similarly, there aren’t going to be widespread 8k releases if everyone is watching on 1440p screens and so forth.
But what that ALSO means is that there is no reason to justify using 8k cameras if the best you can hope for is a premium 4k stream of a sporting event. And news outlets are fairly regularly the only source of video evidence of literally historic events.
From a much more banal perspective, it is why there is a gap in TV/film where you go from 1080p or even 4K re-releases (of older material shot on high-quality film) to increasingly shady upscaling of 720p or even 480p content, before everything becomes natively 4K again. Oversimplifying, it is because we were using MUCH higher quality cameras than we really “should” have been for so long, before switching to cheaper film and outright digital sensors because “there is no point”. Obviously this ALSO depends on saving the high-resolution originals but… yeah.
It’s not exactly “there is no point”. It’s more like “the incremental benefit of filming and broadcasting in 8K does not justify the large cost difference”.
Filming in 8k does have advantages. You can crop without losing quality.
I’m sorry, but if we are talking about 8k viability in TVs, we are not talking about shooting in 8k for 4k delivery.
You should be pointing out that shooting in higher than 8k, so you have the freedom to crop in post, is part of the reason 8k is burdensome and expensive.
So correct the person above me, they wrote about shooting in 8k.
The RED V-Raptor is expensive for consumer grade but nothing compared to some film equipment. There are lenses more expensive than an 8k camera.
Which, for all intents and purposes, means there is no point. Because no news network is going to respond to “Hey boss, I want us to buy a bunch of really expensive cameras that our audience will never notice because it will make our tape library more valuable. Oh, not to sell, but to donate to museums.” with anything other than laughter and MAYBE firing your ass.
the point is, the cost/benefit calculation will change over time as the price of everything goes down. It’s not a forever “no point”.
… Almost like it would be more viable to film in higher resolution if more consumers had higher resolution displays?
Not only does the content not exist yet, it’s just not practical. Even now, 4K broadcasting is rare and 4K streaming is a premium (and not always with a good bitrate, which matters a lot more) when it was once offered as a cost-free feature. Imagine 8K, which would roughly quadruple the amount of data required to transmit it (and transmit speed doesn’t scale linearly; 4x the speed would probably be at least 8x the cost).
And I seriously think no one except the nerdiest of nerds would notice a difference between 4K and 8K.
That’s usually the case
Not only does it not exist, it isn’t wanted. People are content watching videos on YouTube and Netflix. They don’t care for 4k. Even if they pay extra for Netflix 4k (which I highly doubt they do) I still question if they are watching 4k with their bandwidth and other limiting factors, which means they’re not watching 4k and are fine with it.
TV manufacturers are idiots.
I blacklist the TV’s Ethernet and WiFi MAC addresses. I strongly encourage using a computer, Apple TV, or anything else that can’t fingerprint everything you use your TV for.
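For what it’s worth, on a Linux-based router the “blacklist the MAC” part amounts to something like the sketch below (the MAC address is a placeholder; most consumer routers expose the same thing through an access-control page in their UI):

```python
# Minimal sketch: drop forwarded traffic from one device by MAC address on a
# Linux router. Placeholder MAC; adapt to your own setup (or just use your router's UI).
import subprocess

TV_MAC = "aa:bb:cc:dd:ee:ff"  # placeholder, not a real device

subprocess.run([
    "iptables", "-I", "FORWARD",          # match traffic the router forwards
    "-m", "mac", "--mac-source", TV_MAC,  # coming from the TV's MAC
    "-j", "DROP",                         # and drop it
], check=True)
```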
This. I’ll happily buy an 8k TV only if it’s a dumb TV/monitor.
No, I want only one DP port and to have a separate box that selects sources. That way I have the ports I want
I don’t want 8K. I want my current 4K streaming to have less pixelation. I want my sound to be less compressed. Make them closer to Ultra HD Blu-ray disc quality before forcing 8K down our throats… unless doing that gives us better 4K overall.
Yeah, 4K means jack if it’s compressed to hell. If you end up with pixels being repeated 4x to save on storage and bandwidth, you’ve effectively just recreated 1080p without the upscaling.
Just like internet. I’d rather have guaranteed latency than 5Gbps.
Yep, just imagine how bad the compression artefacts will be if they double the resolution but keep storage/network costs the same.
Doubling the dimensions makes it 4x the data.
Not if you only double it in one direction. Checkmate.
That’s not true for compressed video. It doubles the bitrate for the same quality on modern codecs (H.265, AV1, etc.)
Increasing resolution but keeping the same bitrate still improves the image quality, unless the bitrate was extremely low in the first place. Especially with modern codecs
20mbps 4k looks a lot better than 20mbps 1080p with AV1
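To put rough numbers on this exchange: uncompressed pixel data really does quadruple when you double both dimensions, but the bitrate a delivered stream gets is an encoder choice, not a law. A quick sketch (the 20 Mbps figure is just the example from the comment above, and the bit-depth/chroma assumptions are typical rather than measured from any particular service):

```python
# Raw data rate vs. compressed "bits per pixel" at a fixed stream bitrate.
def raw_mbps(width, height, fps=24, bit_depth=10, chroma=1.5):
    """Uncompressed data rate in Mbit/s (4:2:0 chroma subsampling ~= 1.5 samples/pixel)."""
    return width * height * chroma * bit_depth * fps / 1e6

def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    """Compressed bits available per pixel per frame at a given stream bitrate."""
    return bitrate_mbps * 1e6 / (width * height * fps)

print(f"raw 1080p24: {raw_mbps(1920, 1080):,.0f} Mbps")   # ~746 Mbps
print(f"raw 4K24:    {raw_mbps(3840, 2160):,.0f} Mbps")   # ~2,986 Mbps (4x)

print(f"1080p @ 20 Mbps: {bits_per_pixel(20, 1920, 1080):.2f} bits/px")  # ~0.40
print(f"4K    @ 20 Mbps: {bits_per_pixel(20, 3840, 2160):.2f} bits/px")  # ~0.10
```

So a 20 Mbps 4K stream is leaning heavily on the codec doing more with fewer bits per pixel, which modern codecs largely manage.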
Bingo, if I were still collecting DVDs/HD DVDs like I was in the 90’s, it might be an issue. Streaming services and other online media routed through the TV can hardly buffer to keep up with play speed at 720, so what the fuck would I want with a TV that can show a higher quality of picture which it can also not display without stutter-buffering the whole of a 1:30:00 movie?
I do want a dumb 8K TV. I do not want all the so called smart features of a TV. Small Linux device with kodi works way better.
Some Xiaomi TVs have root exploits, so you can manually disinfect the OS, but it’s cumbersome to get done since you need to enter adb commands over the remote control to get there in the first place.
Easier to just use an external device and the TV as a screen only. Personally, I’ve been using the Nvidia Shield for 5+ years now and regret nothing.
Not ideal, but you can air gap the TV from the network, and use some small sbc, or even a firestick or android box. That’s what I do. Stremio?
As far as my TV is concerned I don’t have an internet connection.
I do want a TV that can access Netflix etc without another box. I just don’t want the surveillance that comes with it.
I personally hate Kodi UI. But I get your point
uh…there are hundreds of Kodi UIs.
I just run mine without ever connecting it to the internet.
I run an Apple TV (shock, walled garden!), as it is the only device I’ve seen that consistently matches frame rates properly on the output.
I am a filmmaker and have shot in 6k+ resolution since 2018. The extra pixels are great for the filmmaking side. Pixel binning when stepping down resolutions allows for better noise, color reproduction, sharpened details, and great for re-framing/cropping. 99% of my clients want their stuff in 1080p still! I barely even feel the urge to jump up to 4k unless the quality of the project somehow justifies it. Images have gotten to a good place. Detail won’t provide much more for human enjoyment. I hope they continue to focus on dynamic range, HDR, color accuracy, motion clarity, efficiency, etc. I won’t say no when we step up to 8k as an industry but computing as a whole is not close yet.
The same argument goes for audio too.
6K and 8K are great for editing, just like how 96 kHz / 32-bit and above is great for editing. But it’s meaningless for watching and listening (especially for audio; you can’t hear the difference above 44.1 kHz / 16-bit). When editing you’ll often stack up small artifacts, which can be audible or visible if you’re editing at the final resolution but easy to smooth over if you’re editing at higher resolutions.
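For anyone curious, the numbers behind the “can’t hear the difference” claim are just standard Nyquist and quantization math:

```python
# Why 44.1 kHz / 16-bit covers human hearing for playback purposes.
import math

sample_rate_hz = 44_100
bit_depth_bits = 16

nyquist_hz = sample_rate_hz / 2                          # highest representable frequency
dynamic_range_db = 20 * math.log10(2 ** bit_depth_bits)  # quantization dynamic range

print(f"Nyquist limit: {nyquist_hz / 1000:.2f} kHz")  # 22.05 kHz, above ~20 kHz adult hearing
print(f"Dynamic range: {dynamic_range_db:.1f} dB")    # ~96 dB, far beyond a quiet listening room
```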
Imagine you’re finishing in 8K and want to shoot at a higher resolution to give yourself some options for reframing and cropping. I don’t think RED, Arri, or Panavision even makes a cinema camera with a resolution over 8K; I think Arri is still 4K max. You’d pretty much be limited to Blackmagic cameras for 12K production today.
Plus the storage requirements for keeping raw footage in redundancy. Easy enough for a studio, but we’re YEARS from 8k being a practical resolution for most filmmakers.
My guess is most of the early consumer 8k content will be really shoddy AI upscaled content that can be rushed to market from film scans.
Film scanning at 4K already reveals the granular structure of film; at 8K it’s going to become hard to ignore. And you’re spot on: they’ll do crappy 8K upres garbage for ages before the storage and streaming become practical.
There is also a 17K Blackmagic coming out! The high-resolution sensors they use aren’t a standard RGB pixel layout though, so it’s not a great direct comparison. Like you said though, there’s no pipeline or good workflow for 8K in the slightest. It will take years if the industry decides to push for it.
The extra pixels are great for the filmmaking side.
I would much rather have 1080p content at a high enough bitrate that compression artifacts are not noticeable.
Yeah, as long as they don’t discontinue them.
Here in Australia, they are almost gone. Disney doesn’t release them anymore, and other studios only release the biggest titles; smaller movies get fewer and fewer releases. Some TV shows only get DVD. It’s got me importing discs for things I really want, and importing a lot of stuff from the high seas.
Even if they discontinue Blu-ray, 4K content isn’t going anywhere, and neither are 4K TVs.
The 4k you find on streaming services can’t really be compared to the 4k you find on Blu-ray. It’s a different league. Turns out bitrate actually matters
For what content? Video gaming (GPUs) has barely gotten to 4K. Movies? 4K streaming is a joke; you’re better off with a 1080p BD. If you care about quality, go physical… UHD BD is hard to find and you have to wait and hunt to get discs at reasonable prices… And these days there are only a couple of UHD BD player manufacturers left.
It’s such a shame that UHD isn’t easier to find. Even the ones you can find are poorly mastered half the time. But a good UHD on an OLED is chef’s kiss, just about the closest you can get to having a 35mm reel/projector at home.
You are absolutely on point with 4k streaming being a joke. Most 4k streams are 8-20 Mbps. A UHD runs at 128 Mbps.
Most 4k streams are 8-20 Mbps. A UHD runs at 128 Mbps.
Bitrate is only one variable in overall perceived quality. There are all sorts of tricks that can significantly reduce file size (and thus bitrate of a stream) without a perceptible loss of quality. And somewhat counterintuitively, the compression tricks work a lot better on higher resolution source video, which is why each quadrupling in pixels (doubling height and width) doesn’t quadruple file size.
The codec matters (h.264 vs h.265/HEVC vs VP9 vs AV1), and so do the settings actually used to encode. Netflix famously is willing to spend a lot more computational power on encoding, because they have a relatively small number of videos and many, many users watching the same videos. In contrast, YouTube and Facebook don’t even bother re-encoding into a more efficient codec like AV1 until a video gets enough views that they think they can make up the cost of additional processing with the savings of lower bandwidth.
Video encoding is a very complex topic, and simple bitrate comparisons only barely scratch the surface in perceived quality.
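As a deliberately simplified illustration of “the settings matter”: a constant-quality AV1 encode with ffmpeg’s libaom-av1 targets a quality level instead of a fixed bitrate, which is the general shape of the “spend more compute encoding, save bandwidth” tradeoff described above. Filenames and values below are placeholders, not anyone’s production settings:

```python
# Sketch: invoke ffmpeg for a constant-quality AV1 encode (assumes ffmpeg is
# installed with libaom-av1 support; input/output names are placeholders).
import subprocess

subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "libaom-av1",
    "-crf", "30", "-b:v", "0",   # constant-quality mode: lower CRF = higher quality
    "-cpu-used", "4",            # slower presets trade encode time for compression efficiency
    "-c:a", "copy",              # leave the audio track untouched
    "output.mkv",
], check=True)
```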
It’s because for the Average Joe, having a TV box at the end of your driveway that has the latest big number on it is important. It’s how they gain their identity. Do not upset them for obvious reasons.
I don’t know if it changed, but when I started looking around to replace my set about 2 years ago, it was a nightmare of marketing "gotchas".
Some TVs were advertising 240fps, but only had 60fps panels with special tricks to double framerate twice or something silly. Other TVs offered 120fps, but only on one HDMI port. More TVs wouldn’t work without internet. Even more had shoddy UIs that were confusing to navigate and did stuff like default to their own proprietary software showing Fox News on every boot (Samsung). I gave up when I found out that most of them had abysmal latency since they all had crappy software running that messed with color values for no reason. So I just went and bought the cheapest TV at a bargain overstock store. Days of shopping time wasted, and a customer lost.
If I were shown something that advertised with 8K at that point, I’d have laughed and said it was obviously a marketing lie like everything else I encountered.
Asus makes their version of a 4k OLED LG panel with no shitty ‘smart’ software.
in that situation, Asus are the shitty part, though it is nice to see more TV-sized monitors. Fuck HDMI.
Did I miss something with Asus recently? I’ve only had good experiences with their hardware.
ASUS used to be the goat brand. They have since enshittified, and the biggest hit was their customer service. It’s 100% ass now. The product itself is really hit or miss now too.
I’ll consider you lucky. I’ve had many experiences with their hardware across different segments (phones, tablets, laptops, mainboards, NICs, displays, GPUs).
They’re an atrocious vendor with extremely poor customer support (and shitty SW practices for UMA systems and motherboards).
I don’t think many people have been as unfortunate as I have with them; the general consensus is that they mark their products up considerably relative to the competition (particularly mainboards and GPUs).
To be fair, their contemporaries aren’t much better.
Dang.
I switched to ASRock for my AMD build for specific feature sets and reading ASUS AM5 stuff it looks like that was a good idea.
But ASRock 800-series AM5 boards are killing Granite Ridge 3D CPUs en masse. Funnily enough, it happened to me.
I begrudgingly switched to Asus after my CPU was RMA’d as that was the only other vendor to offer ECC compat on a consumer platform.
How about 7800X3D?
As someone who stupidly spent the last 20 or so years chasing the bleeding edge of TVs and A/V equipment, GOOD.
High end A/V is an absolute shitshow. No matter how much you spend on a TV, receiver, or projector, it will always have some stupid gotcha, terrible software, ad-laden interface, HDMI handshaking issue, HDR color problem, HFR sync problem or CEC fight. Every new standard (HDR10 vs HDR10+, Dolby Vision vs Dolby Vision 2) inherently comes with its own set of problems and issues and its own set of “time to get a new HDMI cable that looks exactly like the old one but works differently, if it works as advertised at all”.
I miss the 90s when the answer was “buy big chonky square CRT, plug in with component cables, be happy”.
Now you can buy a $15,000 4k VRR/HFR HDR TV, an $8,000 4k VRR/HFR/HDR receiver, and still somehow have them fight with each other all the fucking time and never work.
8K was a solution in search of a problem. Even when I was 20 and still had good eyesight, sitting 6 inches from a 90-inch TV, I’m certain the difference between 4K and 8K would be barely noticeable.
I haven’t seen this mentioned, but apart from 8K being expensive, requiring new production pipelines, unwieldy for storage and bandwidth, unneeded, and not fixing existing problems with 4K, it requires MASSIVE screens to reap any benefit.
There are several similar posts, but suffice to say, 8K content is only perceived by average eyesight at living-room distances when screens are OVER 100 inches diagonal at the bare minimum. That’s about 7 feet wide.
Source: https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship
Tell me Legolas, what do your elven eyes see?
Fucking pixels Aragorn, it makes me want to puke. And what the fuck is up with these compression artifacts? What tier of Netflix do you have?
Sorry Legolas, could we just enjoy the movie?
Maybe if the dwarf stops stinking up the place. And don’t think I didn’t see him take that last chicken wing, fucking dwarves.
Not sure where 1440p would land, but after using one for a while I was going to upgrade my monitor to 4K, then realized I’m not disappointed with my current resolution at all, so I opted for a 1440p ultrawide instead and haven’t regretted it.
My TV is 4k, but I have no intention of even seriously looking at anything 8k.
Screen specs seem like a mostly solved problem. Would be great if focus could shift to efficiency improvements instead of adding more unnecessary power. Actually, boot time could be way better, too (ie get rid of the smart shit running on a weak processor, emphasis on the first part).
8K would only work well in a movie theater type setting I guess.
4K at 25" was worth it for me, but I only spent about £140 on it, so YMMV. It’s nice but not essential, and after 1080p the extra pixels only add so much.
I watch torrented shows with VLC on my laptop. Why would I want a giant smartphone that spies on me?
So many things have reached not only diminishing returns, but no returns whatsoever. I don’t have a single problem that more technology will solve.
I just don’t care about any of this technical shit anymore. I only have two eyes, and there’s only 24 hours in a day. I already have enough entertainment in perfectly acceptable quality, with my nearly 15 year old setup.
I’ve tapped out from the tech scene.
I’ve hit that same wall. I’m perfectly happy with a $300 smartphone, because it does absolutely everything I need to do, fast enough to not make me want to throw it across the room, and well enough that I don’t notice the difference between it and a high-end device.
Do I notice the difference after three or four years of having the device and finally upgrading it to a new device in that price range? Sure, I notice it. But day to day use, I don’t notice it and that’s what matters.
I don’t understand most of the things I used to enjoy as a kid. I went from radio to cassette to CD to MiniDisc to MP3s. Now I’m supposed to endlessly change things around to keep up with media players and codecs and whatevers. No thanks.
I used to enjoy programming and tinkering with computers and microcontrollers.
Now I have to be an expert in 15 unrelated fields and software stacks, because even a simple job of turning a button press into a single output pulse is a weeks-long nightmare of IDEs and OSes and embedded Linuxes and 32-bit microcontrollers and environments, none of which are clear and straightforward, and all of which have subtle inter-dependencies.
So to turn on an LED with a switch now requires a multi-core 16GB main PC (so limited! You need more!) so I can open a multi-GB IDE (that can support every language ever invented) that requires an SSD just to be able to navigate the 35 windows it opens in less than an hour, so I can use AI to copy-paste hundreds of lines of boilerplate code I don’t understand, so I can type a few lines of code?
And that’s not counting all the new companies and architectures.
Most Americans are out of money and can’t find good jobs. We are clinging to our old TVs and cars and computers and etc. for dear life, as we hope for better days.
And what can you even watch in true 8K right now? Some YouTube videos?
But but but, don’t you want better hardware so we can read your brain waves to automatically show you something you’re in the mood to watch while we save that info and sell it to someone who wants to control your nervous system later?
Calm down there Edward Nigma.
The difference between 1080 and 4K is pretty visible, but the difference between 4K and 8K, especially from across a room, is so negligible that it might as well be placebo.
Also the fact that 8K content takes up a fuckload more storage space. So, there’s that, too.
solution: 16K 3D TV. buy now.
I only want the curved IMAX version, though
I just want a projector at that point, seems we are destined to bounce around aspect ratios and that might help the black bar situation.
if it’s a 16’ x 9’ screen, I’m in.
Even 1080p isn’t hugely different from 4k in many cases. Yeah, you can probably notice it, but both are fantastic resolutions. I’ve had a 4k TV for years, and I can count the number of times I’ve actually watched 4k content on it on two hands because it generally isn’t worth the storage space or extra cost.
I find that it really depends on the content and the size of the display.
The larger the display, the more you’d benefit from having a higher resolution.
For instance, a good quality 1080p stream vs a highly compressed 4k stream probably won’t look much different. But a “raw” 4k stream looks incredible… think of the demos you see in stores showing off 4k TVs… that quality is noticeable.
Put the same content on a 50"+ screen, and you’ll see the difference.
When I had Netflix, watching in 4k was great, but to me, having HDR is “better”.
On a computer monitor, there’s a case for high-resolution displays because they allow you to fit more on the screen without making the content look blurry. But on a TV, 4k + HDR is pretty much peak viewing for most people.
That’s not to say that if you create content, 8k is useless. It can be really handy when cropping or re-framing if needed, assuming the desired output is less than 8k.
think of the demos you see in stores showing off 4k TVs… that quality is noticeable.
Sure. But remember that much of the time, the content is tuned for what the display is good at, which won’t necessarily reflect what you want to watch on it (i.e. they’re often bright colors with frequent color changes, whereas many movies are dark with many slow parts). At least at the start, many 4k TVs had a worse picture than higher end 1080p TVs, and that’s before HDR was really a thing.
So yeah, it highly depends on the content. As you mentioned, in many cases, 1080p HDR will be better than 4k non-HDR. Obviously 4k HDR on a good display is better than 1080p HDR on a good display, but the difference is much less than many people claim it to be, especially at a typical TV viewing distance (in our case, 10-15 ft/3-5m).
computer monitor
I find the sweet spot to be 1440p. 4k is nicer, but the improvement over 1440p is much less than 1440p vs 1080p. My desktop monitor is a 27" 1440p monitor w/ approx 109 ppi, and my work laptop is a Macbook Pro w/ 3024x1964 resolution w/ approx 254 ppi, more than double. And honestly, they’re comparable. Text and whatnot is certainly sharper on the nicer display, but there are certainly diminishing returns.
That said, if I were to watch movies frequently on my computer, I’d prefer a larger 4k monitor so 1080p content upscales better. But for games and normal computer stuff, 1440p is plenty.
Given that I don’t find a ton of value in 4k over 1080p, 8k will be even more underwhelming.
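(For reference, those ppi figures fall straight out of the pixel counts and panel diagonals; the 14.2" diagonal assumed for the MacBook Pro below is my guess, not something stated above.)

```python
# Pixels per inch from resolution and panel diagonal (diagonals are assumptions).
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p monitor: {ppi(2560, 1440, 27):.0f} ppi')    # ~109
print(f'14.2" MacBook Pro: {ppi(3024, 1964, 14.2):.0f} ppi')  # ~254
```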
think of the demos you see in stores showing off 4k TVs… that quality is noticeable.
Because stores use a high-quality feed and force you to stand within 4 feet of the display. There is a whole science to how Best Buy manipulates TV sales. They will not let you adjust the picture settings on lower-margin TVs.
Because stores use a high quality feed
Yes, obviously, and consumers who are buying such high-end displays should do their best to provide the highest quality source to play back on those displays.
Distance from the display is important, too. On a small TV, you’ll be close to it, but resolution won’t matter as much.
But from across the room, you want a higher resolution display up to a certain point, or else you’ll see large pixels, and that looks terrible.
Personally, going with a 4k TV was a big leap, but the addition of HDR and an OLED display (for black blacks) had the most impact.
depends how far you are from the screen.
True. Our TV is 10-15 ft/3-5m away on a ~60in screen, and at that distance the difference is noticeable, but not significant. We have a 40" screen with a much closer viewing distance (usually 5-8 ft/~2m), and we definitely notice the difference there.
If I was watching movies at a desk w/ a computer monitor, I’d certainly notice 1080p vs 4k, provided the screen is large enough. In our living room with the couch much further from the screen, the difference is much less important.
It creates more problems than it solves. You would need an order of magnitude more processing power to play a game on it. Personally I would prefer 4K at a higher framerate. Even 1080 if it improves response.
Videos in 8K are massive. You need better codecs to handle them, and they aren’t that widely supported. Storage is more expensive than it was a decade ago.
Also, there is no content. Nobody wants to store and transmit such massive amounts of data over the internet.
HDMI cables will fail sooner at higher resolutions. That 5 year old cable will begin dropping out when you try it at 8k.
4K is barely worth the tradeoffs.
A couple of things: every jump like that in resolution is about a 10% increase in size at the source level. So 2K is ~250GB, 4K is ~275GB. I haven’t had to deal with 8K myself yet, but it would be at ~300GB. And then you compress all that for places like Netflix and the size goes down drastically. Add to that codec improvements over time (like x264 -> x265) and you might actually end up with an identical compressed size while carrying 4x more pixels.
HDMI is digital. It doesn’t start failing because of increased bandwidth; there’s nothing consumable. It either works or it doesn’t.
Yeah, legitimate 8K use cases are ridiculously niche, and I mean… really only have value if you’re talking about an utterly massive display, probably around 90 inches or larger, and even then in a pretty small room.
The best use cases I can think of are for games where you’re already using DLSS, and can just upscale from the same source resolution to 8K rather than 4K? Maybe something like an advanced CRT filter that can better emulate a real CRT with more resolution to work with, where a pixel art game leaves you with lots of headroom for that effect? Maybe there’s value in something like an emulated split screen game, to effectively give 4 players their own 4K TV in an N64 game or something?
But uh… yeah, all use cases that are far from the average consumer. Most people I talk to don’t even really appreciate 1080p->4K, and 4X-ing your resolution again is a massive processing power ask in a world where you can’t just… throw together multiple GPUs in SLI or something. Even if money is no object, 8K in mainline gaming will require some ugly tradeoffs for the next several years, and probably even forever if devs keep pushing visuals and targeting upscaled 4K 30/60 on the latest consoles.
4K for me as a developer means that I can have a couple of source files and a browser with the API documentation open at the same time. I reckon I could legitimately use an 8K screen - get a terminal window or two open as well, keep an eye on builds and deployments while I’m working on a ticket.
Now yes - gaming and watching video at 8K. That’s phenomenally niche, and very much a case of diminishing returns. But some of us have to work for a living as well, alas, and would like them pixels.
Even as a dev, I use a 32" QHD screen for programming. If I went 4K, I would need to use 150% scaling, and that breaks a LOT of stuff.
Everything is built for 100% scaling. Every time I’ve plugged my PC into a 4K display I’ve regretted it. It goes to 30Hz (on HDMI) or glitches out or something. Even if it doesn’t, it’s never as smooth.
I have a 43" 4K and at that physical size, display scaling at 100% is appropriate (despite Windows trying to run it at 300% out of the box), and it is legitimately useful. It’s effectively four 1080p screens in a grid with no bezel between them.
Good point, 4K text for programming is pretty fantastic. If you don’t mind small text and use a big monitor, I could see 8K bringing some worthwhile clarity improvements to some productivity workflows. It’s probably better for monitors than it is for TVs.