NVIDIA launched its G-SYNC Ultimate specification to quite some fanfare, and it originally had extremely strict standards on the nit and gamut ratings of the display, but it looks like the company is lowering its standards (quite literally) to allow more monitors to meet the mark. While the luminance previously required was VESA's DisplayHDR 1000 standard (1000 nits), the specification now accepts displays rated as low as 600 nits - a 40% reduction in peak luminance.
NVIDIA responds to G-SYNC Ultimate specification change: "G-SYNC Ultimate was never defined by nits alone nor did it require a VESA DisplayHDR 1000 certification"
Before we go any further, here is NVIDIA's statement on the change (via Videocardz):
Late last year we updated G-SYNC ULTIMATE to include new display technologies such as OLED and edge-lit LCDs.
All G-SYNC Ultimate displays are powered by advanced NVIDIA G-SYNC processors to deliver a fantastic gaming experience including lifelike HDR, stunning contrast, cinematic colour and ultra-low latency gameplay. While the original G-SYNC Ultimate displays were 1000 nits with FALD, the newest displays, like OLED, deliver infinite contrast with only 600-700 nits, and advanced multi-zone edge-lit displays offer remarkable contrast with 600-700 nits. G-SYNC Ultimate was never defined by nits alone nor did it require a VESA DisplayHDR 1000 certification. Regular G-SYNC displays are also powered by NVIDIA G-SYNC processors as well.
The ACER X34 S monitor was erroneously listed as G-SYNC ULTIMATE on the NVIDIA web site. It should be listed as “G-SYNC” and the web page is being corrected.
— NVIDIA to Overclock3D
While it may seem like a cop-out from the company, it is worth adding that OLED displays do indeed offer infinite contrast (because the pixels are self-emissive) and incredibly deep blacks, and I would happily take an OLED with 600-700 nits over a non-OLED display with 1000 nits any day. That said, advanced multi-zone edge-lit displays do not offer infinite contrast or deeper blacks, and a VESA DisplayHDR 1000 rated display will almost always outperform them.
NVIDIA has also mentioned that the G-SYNC Ultimate listing of the display in question (which could only hit up to 550 nits) was accidental and that it falls under the standard G-SYNC category, not G-SYNC Ultimate (even though regular G-SYNC displays are powered by the G-SYNC processor as well). All that said, a rating is only as good as the products in the market, and since the market is clearly not ready to ship 1000-nit displays to everyone (at least not for a few years), it absolutely makes sense for NVIDIA to go down this route and bring more displays into its fold.
With microLED in the works as a replacement for OLED (a technology capable of up to 5,000 nits of brightness), we probably only have a few years before the entire display market is disrupted anyway, so I am not particularly miffed by NVIDIA's decision here. Until bezel-less microLED displays become mainstream, OLED will be my preferred choice of display (and also my recommendation to anyone), even over a 1000-nit non-OLED display.