Looked at the CES reveals, and aside from some minor improvements, it's nothing but overloaded AI crap.
Even on TVs from 10 years ago, the first thing you had to do was turn off the stupid auto frame interpolation, smoothing, lighting, and other effects so you could actually enjoy your content in its original detail and at the correct FPS.
It took me way too long to figure out what was going on with those settings. One of my relatives' TVs was like this back in the day, and at first I thought it was just their "HD" setup, which made me completely write off getting anything HD because of that fake soap-opera look. It wasn't until I was gifted a Blu-ray player that I realized their TV just had horrible "enhancement" shit turned on.
A friend is buying a TV or a monitor for console gaming anyway, and man, TVs are actually pretty decent for gaming nowadays. I hadn't checked any out in several years.
I bought a UHD LED TV in like 2016, and what a POS it is compared to these modern models. I mean, I haven't had it for years (gave it to my sister), but still.
I thought they looked pretty damn nifty. And AI isn't a curse word when it comes to everything. I get being annoyed at the marketing, I am too, but like, isn't Nvidia DLSS AI? That shit's actually good.
Sadly it also enables studios to cheap out on optimization; you shouldn't need upscaling to hit playable framerates at 1080p medium on a new GPU.
Well, that is a good point in late-stage capitalism.
I was idealistically thinking it might be beneficial for those 480Hz and whatnot screens coming out, since nothing renders natively at that rate anyway.
And on these new Blackwells, like the 5070, the VRAM is still only 12 GB, but they claim they have some much better compression tech or something.
Idk man, but to me, just treating everything AI as "ick" is sort of Luddite-ish. Yeah, it's a garbage, overhyped marketing term, but some of the applications people are coming up with for sophisticated neural networks are pretty goddamn cool.