Wondering what the difference is between 4K and UHD? You're not alone.
When searching for the best 4K TVs out there, you may come across various displays describing themselves as 4K UHD.
Buying a TV is no easy task. You'll need to consider if you want a 120Hz TV, an HDMI 2.1-compatible display, and whether you want QLED or OLED. Throw in lots of marketing buzzwords, and you might end up with a sub-par TV.
So, we're here to clear up the difference between 4K and UHD.
4K vs UHD
While 4K is technically different from UHD, TV marketers have adopted the term 4K for their UHD displays.
The UHD resolution has an aspect ratio of 16:9, the industry standard for TVs, compared to 4K's roughly 1.9:1.
True 4K displays with a 4096x2160 resolution are pretty rare. A few monitors support it, but most modern displays use 3840x2160 instead and market themselves as 4K.
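You can check these aspect ratios yourself. As a quick sketch, the snippet below reduces each pixel resolution to its simplest ratio: UHD's 3840x2160 comes out to exactly 16:9, while true 4K's 4096x2160 reduces to 256:135, which is approximately 1.9:1.

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a pixel resolution to its simplest whole-number ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

# UHD: reduces exactly to the 16:9 TV standard
print(aspect_ratio(3840, 2160))  # (16, 9)

# True (DCI) 4K: reduces to 256:135, roughly 1.9:1
print(aspect_ratio(4096, 2160))  # (256, 135)
print(round(4096 / 2160, 2))     # 1.9
```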
To make it more confusing, 8K displays that boast a resolution of 7680x4320 also come under the banner of UHD, making it necessary to describe displays as 4K UHD or 8K UHD.
Why is UHD called 4K?
After seeing that UHD is not actually 4K, you might wonder why so many manufacturers describe their TVs as 4K UHD. Well, while there's no definitive reason, the most obvious is that 3840 is close to 4000.
However, this still doesn't explain why UHD broke with the earlier resolution-naming convention. Its predecessor, Full HD, is commonly described as 1080p.
The name 1080p refers to the vertical resolution (1920x1080). 720p, 480p, and 1440p all follow this system.
Under this naming system, 4K would actually be called 2160p.
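To illustrate the older convention, here is a minimal sketch (the `p_name` helper is hypothetical, just for demonstration) that names a resolution by its vertical pixel count, the way 1080p and 720p were named:

```python
def p_name(width, height):
    """Name a resolution by its vertical pixel count, 1080p-style.

    Hypothetical helper for illustration only.
    """
    return f"{height}p"

print(p_name(1920, 1080))  # 1080p  (Full HD)
print(p_name(1280, 720))   # 720p
print(p_name(3840, 2160))  # 2160p  (what "4K UHD" would be called)
```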
READ MORE: Is it worth buying an 8K TV now?