This made me think of HDR. While it’s even less common than 4K, the difference is that HDR doesn’t really take any extra hardware grunt – it’s literally just the difference between rendering at 24/32-bit color per pixel versus 30/40-bit color per pixel. I mean, if you ran a PC game at 16-bit vs 24/32-bit color on pretty much any non-Intel GPU made in the last decade, you’d notice there’s basically no performance difference (unlike 20 years ago).
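To make the bit-depth numbers concrete: “24/32-bit” color means 8 bits per RGB channel (the extra 8 being alpha), while “30/40-bit” HDR output means 10 bits per channel. A quick sketch (function name is mine, just for illustration):

```python
def shades_per_channel(rgb_bits_per_pixel, channels=3):
    """How many distinct values each color channel can hold,
    given the total RGB bits per pixel (alpha not counted)."""
    return 2 ** (rgb_bits_per_pixel // channels)

print(shades_per_channel(24))  # SDR: 8 bits/channel -> 256 shades
print(shades_per_channel(30))  # HDR: 10 bits/channel -> 1024 shades
```

Four times the shades per channel, but the pixel only grows by a few bits – which is why the performance cost is basically nil.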
Even on the software side, game engines have been internally rendering at wide color ranges for a while now, especially since most GPU hardware of the last decade natively works at 32-bit floating point (which is way more range than necessary for HDR). This “32-bit floating point” is in fact where the term “flops” comes from with regard to calculations (as in teraflops) – it literally stands for “floating point operations per second” and typically refers to single precision (32-bit float) or double precision (64-bit float) calculations.
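This is also why HDR output is mostly “free” for engines: the internal float values only get quantized to integers at the very end, and a 10-bit target simply keeps detail that an 8-bit target throws away. A toy sketch of that final quantization step (my own illustrative function, not any engine’s API):

```python
def quantize(value, bits):
    """Map a linear intensity in [0, 1] to an integer code
    at the given output bit depth (round to nearest)."""
    levels = (1 << bits) - 1
    return int(value * levels + 0.5)

a, b = 0.500, 0.502          # two nearby internal float intensities
print(quantize(a, 8), quantize(b, 8))    # 128 128 -> collapsed at 8-bit
print(quantize(a, 10), quantize(b, 10))  # 512 514 -> still distinct at 10-bit
```

The engine does the same float math either way; only the last step changes.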
The tl;dr version is that HDR would appease graphics lovers at basically no cost to non-HDR users, since it wouldn’t increase hardware performance requirements at all, and it doesn’t even really need new hardware (the PS4 got patched to support HDR10 even though the system launched more than a year before the standard was finalized).
We’ll find out tomorrow. Reggie’s comments were “we will talk about the graphical differences between Breath of the Wild on Wii U and NX later.” This could be as simple as the NX being more powerful than the Wii U and able to push more shader effects or something.
Or it could be HOLOGRAMS!!! OMG!!!!!!