Edit: I guess this post is free advertising for these shitters, so I will describe what I previously linked.
There is this TV you can get for free, but it has ads on screen constantly, and it has a camera pointed at you to record your reactions to the ads and to ensure you don’t cover up the ad portion of the screen.
Exactly, what’s the use of a smart TV when I have a game console capable of streaming everything a “Smart TV” can AND playing games/browsing the Web?
9/10 times people use a Fire Stick or cable box to watch TV anyway; all I need is volume, input selection, and power.
You need that “smart” stuff to make smaller resolutions look good on a 4k TV.
Upscaling takes processing power, so adding in the smart functions doesn’t really require anything extra.
Without the upscaling, it’s often a waste to buy a 4k TV because very few sources are in 4k. You’d be getting the same picture quality as 1080p.
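For a rough sense of what “takes processing power” means here, a toy sketch like the one below (Python, nearest-neighbour, every name made up for illustration; real TV scalers use much fancier filtering in dedicated hardware) still has to touch every one of the roughly 8.3 million pixels in a 4k frame, frame after frame:

    # Toy nearest-neighbour upscale of one 1080p frame to 4k, just to show
    # the scale of the per-pixel work. Not how any real TV scaler works
    # (those use fancier filtering baked into dedicated silicon).

    SRC_W, SRC_H = 1920, 1080   # 1080p source: ~2.07 million pixels
    DST_W, DST_H = 3840, 2160   # 4k panel:     ~8.3 million pixels

    def upscale_nn(frame):
        # frame is a list of SRC_H rows, each a list of SRC_W pixel values
        out = []
        for y in range(DST_H):
            src_row = frame[y * SRC_H // DST_H]
            out.append([src_row[x * SRC_W // DST_W] for x in range(DST_W)])
        return out

    # Even this crude version writes ~8.3 million output pixels per frame,
    # i.e. roughly 500 million pixel lookups per second at 60 fps.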
I have a 4K dumb TV with no Wi-Fi at all and I’ve never had any image issues.
What does wifi have to do with anything?
And if it upscales well, it’s already got the processing power to run all the apps. So, cool, it doesn’t have them, but it’s kind of a waste you don’t… And it seems a lot more likely you’re overestimating it.
But I have a feeling that if you look up your TV on rtings you’ll be surprised, or find out you bought a weird Black Friday model that’s been feature-stripped to be as cheap and low quality as possible while keeping a couple of highlights to advertise on the box.
I think you have a fundamental misunderstanding of what a “smart TV” is
I think people don’t understand that the processing power upscaling needs means adding smart functionality is basically free.
That’s why I mentioned looking up the model number; I’m confident that person’s TV isn’t what they said it is.
No “dumb” TV is going to be good at upscaling because of that. OP might think theirs is, but they’re probably mistaken about something.
“Smart TV” describes the extra functionality, not the processing hardware. If a smart TV has the Wi-Fi and streaming services disabled, it is by definition NOT a smart TV.
If a smart TV has the Wi-Fi and streaming services disabled
If it’s disabled, that doesn’t mean it disappears…
And no one has linked or provided the model number of a good 4k TV that doesn’t have Wi-Fi.
Maybe that’s what’s happening? People think not connecting to Wi-Fi means something doesn’t have Wi-Fi?
Is that what you think?
Edit:
Oh, you’re the one who said they own a TV like that…
You could easily prove your point by providing the model number, but you’re not; you’re just trying to argue.
I don’t see the point in someone acting like that unless you’re trolling, so if you don’t want to say the model, I don’t see any point in trying to help anymore.
Everything nowadays includes microcontrollers or microprocessors, and often even implementations in silicon (i.e. as hardware, not software) of things like decoders.
However, there’s a huge range of those things, and the vast majority don’t have the processing power and/or hardware peripherals to support “Smart TV” functionality.
For example, that upscaling functionality can be implemented in silicon in a separate chip, or as part of the die of an SoC microcontroller, for pennies, whilst the actual programmable part doesn’t have anywhere near the power or memory needed for the code implementing a fancy UI (such as that of a VOD provider like Netflix), because that would cost tens or hundreds of dollars more (just go check the price of a standalone TV box).
The economics nowadays do make it worth it for a TV manufacturer to add the extra hardware needed to make the thing a Smart TV (the kinda crappy kind only costs maybe $10 - $20 or so more), especially if they can use it to shove adverts in front of people to recoup the cost, or sell “Smart TV” as premium functionality. But that’s not at all the same as the HW for hardcoded algorithms such as upscaling being capable of running the software that implements Smart TV functionality.
Your “argument” is built on top of a serious misunderstanding of modern digital systems.
you don’t need to shove “smart” features into them to have upscaling
any dumb computer monitor has it
?
I googled, and found only one high-end monitor with any kind of native upscaling…
And saying that something found on the most expensive monitors has to be in the cheapest TVs doesn’t make a lot of sense…
And that’s not even getting into what kind of upscaling or how a bigger screen size makes upscaling more important.
Like on a 22 inch it doesn’t matter, but on an 80 inch it’s really important.
Hardware upscaling isn’t needed in a monitor (unless maybe in really, really special situations) because it’s almost invariably connected via a digital connection that supports multiple resolutions to a device (such as a computer) with more than enough processing power to do the upscaling itself.
The only situation I can think of where upscaling would be useful in a monitor is one with a VGA connection (mainly to use with really old computers), since that protocol is analog, so pretty much any random resolution can come down the pipe quite independently of the monitor’s native resolution and the digital side of the monitor is forced to adjust it (and a proper upscaling algorithm is a lot nicer than something like double-sampling the analog signal).
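To make “the device does the upscaling itself” concrete, here’s a minimal sketch of the source side scaling a 1080p image up to the monitor’s native 4k before anything goes over the cable. It uses the Pillow library purely as an illustration (the file names are made up); in practice the OS compositor or GPU driver does the equivalent:

    # Minimal sketch: the *source* device (a computer) rescales a 1080p
    # frame to the monitor's native 4k, so the monitor never has to
    # upscale anything. Pillow is used only as an example; a real system
    # does this in the compositor/GPU driver.
    from PIL import Image

    frame = Image.open("frame_1080p.png")                  # hypothetical 1920x1080 source
    native = frame.resize((3840, 2160), Image.BILINEAR)    # software upscale on the PC
    native.save("frame_4k.png")                            # what actually goes to the display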
The previous poster was wrong about upscaling being common in computer monitors nowadays (I vaguely remember it in the early LCD monitor days because of that VGA problem), but that doesn’t mean you’re right that upscaling support being present in a device is the same as everything being in there necessary for full Smart TV functionality. I’m pretty sure the upscaling comes as a hardware implementation of an algorithm, so it’s not a generic computing unit with the right peripherals, computing power and memory to run an Android system or equivalent that just so happens to be running software upscaling.