The pace of market adoption of high dynamic range (HDR)-enhanced video has accelerated with the launch of HDR content by Amazon, Walmart’s Vudu and 21st Century Fox, and with indications from Comcast, Netflix and others that they are committed to rollouts in the near future.
While there’s continuing debate over HDR formats, there’s now a lot more clarity on some points than there was a few months ago, thanks in part to the Blu-ray Disc Association’s (BDA’s) release of the new Ultra HD Blu-ray specifications and a bulletin on recommended HDR specifications from the Consumer Electronics Association (CEA). But there’s still a long way to go as content creators, vendors and distributors wrestle with key issues such as the impact on bandwidth, the question of when the quality of mainstream TV displays will be sufficient to merit delivering HDR-enhanced content, and how to strike a balance between too much and too little in the way of eye-popping dynamism.
Driving progress is a widespread desire to deliver a dramatically better consumer viewing experience without waiting for content formatted to the full pixel density of 4K to emerge. As noted by Matthew Goldman, senior vice president of TV compression technology at Ericsson and executive vice president of the Society of Motion Picture and Television Engineers (SMPTE), ever greater numbers of 4K UHD TV sets are entering the market, bringing with them the ability to upscale HD 1080p through rendering tricks that mimic the effect of the higher pixel density delivered by 4K at 3840 x 2160 pixels.
As a result, Goldman notes, at a reasonable viewing distance there’s little difference in what the viewer sees between pure 4K and upscaled 1080p HD content. While 1080 interlaced content presents some challenges to the upscaling process, they’re not insurmountable, he adds.
“The rule of thumb is you get the full impact of HD at three picture heights’ distance – about two and a half meters, which is a typical viewing distance in people’s homes,” he continues. “To get the full benefit of 4K, which is to say, to see a real difference, you have to be half that distance from the screen, which is not what people are accustomed to.”
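Goldman’s rule of thumb reduces to simple arithmetic on picture height. Below is a minimal sketch of that arithmetic, assuming the three-picture-heights guideline for 1080p and half that distance for 4K as he describes; the 65-inch screen size and function names are illustrative, not drawn from the article.

```python
import math

def picture_height_m(diagonal_inches: float, aspect: float = 16 / 9) -> float:
    """Picture height in meters for a display of the given diagonal."""
    diagonal_m = diagonal_inches * 0.0254
    # d^2 = w^2 + h^2 and w = aspect * h, so h = d / sqrt(1 + aspect^2)
    return diagonal_m / math.sqrt(1 + aspect ** 2)

def full_benefit_distance_m(diagonal_inches: float, resolution: str) -> float:
    """Approximate distance at which a resolution's full detail is visible."""
    height = picture_height_m(diagonal_inches)
    # Goldman: full HD impact at ~3 picture heights; 4K at half that distance.
    return 3 * height if resolution == "1080p" else 1.5 * height

for res in ("1080p", "2160p"):
    print(f"65-inch set, {res}: ~{full_benefit_distance_m(65, res):.1f} m")
# 65-inch set, 1080p: ~2.4 m  (close to Goldman's "about two and a half meters")
# 65-inch set, 2160p: ~1.2 m
```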
All of this has been borne out in focus group tests and by providers of 4K-originated content, who, as previously reported, have found consumers to be underwhelmed by the experience. In contrast, Goldman notes, with HDR, whether it’s offered in conjunction with 4K or 2K pixel density, “You can see the impact of HDR from across the room.”
Given the bandwidth constraints, delivering 4K at four times the pixel density of HD is a potentially costly proposition with marginal return on the allocation of network capacity compared to HDR. “It’s all about more bang for the bit,” Goldman says.
Just how big a price must be paid in added bandwidth for HDR apart from 4K depends on many factors, but it will be much lower than that for 4K, even though the sampling depth for encoding HDR (and 4K) is at least 10 bits versus the 8-bit encoding the industry uses today. “Our experiments have shown the bitrate increase could be as little as zero or maybe as much as 20 percent, depending on a variety of factors,” Goldman says.
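The back-of-envelope math helps frame those numbers: raising sampling depth from 8 to 10 bits grows the uncompressed payload by 25 percent, yet the compressed overhead Goldman cites is far smaller. A quick sketch of that arithmetic; the base bitrate is a hypothetical service figure chosen for illustration.

```python
# 8-bit -> 10-bit sampling grows the uncompressed payload by 25 percent,
# but per Goldman the compressed overhead lands between roughly 0 and 20
# percent once encoder behavior is accounted for.
raw_increase = 10 / 8 - 1          # 0.25 -> +25% uncompressed
print(f"raw sample growth: {raw_increase:.0%}")

base_bitrate_mbps = 8.0            # hypothetical HD channel bitrate
for overhead in (0.0, 0.20):       # Ericsson's reported compressed range
    print(f"{overhead:.0%} overhead: {base_bitrate_mbps * (1 + overhead):.1f} Mbps")
# 0% overhead: 8.0 Mbps
# 20% overhead: 9.6 Mbps
```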
Ericsson has found that with today’s dynamic encoding capabilities, where the level of compression, and hence bandwidth utilization, varies widely across picture sequences, dark scenes in HDR actually tolerate more compression without affecting picture quality than they do in HD. This tends to balance out bright areas, where capturing the nuances requires reducing the amount of compression and raising the bitrate compared to comparable scenes in HD, notes Carl Furgusson, head of strategy for business line compression at Ericsson.
HDR, by definition, entails more frame-to-frame variation in the acceptable compression rate than the standard dynamic range (SDR) used in traditional television. As embodied in the ITU Rec-709 standard, SDR has a luminance range of 100 nits (candelas per square meter) and a color gamut of 16.78 million colors, while HDR supports many hundreds or even thousands of nits and billions of colors, depending on which HDR format is used, the type of display and which of two advanced color gamuts is in play: the current cinematic DCI P3 standard or the ITU’s Rec-2020.
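The color counts follow directly from bit depth, since the total palette is 2 raised to three times the bits per channel. A quick check of the figures cited in this article:

```python
# Total colors = (2 ** bits_per_channel) ** 3, i.e. 2^(3 * bits).
for bits_per_channel in (8, 10, 12):
    colors = (2 ** bits_per_channel) ** 3
    print(f"{bits_per_channel}-bit: {colors:,} colors")
# 8-bit:  16,777,216 colors      -> the 16.78 million of Rec-709 SDR
# 10-bit: 1,073,741,824 colors   -> ~1.07 billion
# 12-bit: 68,719,476,736 colors  -> the 68.7 billion cited for Rec-2020
```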
“We’re finding with HDR the fact that you can reach deeper levels of black means artifacts that a high level of compression of SDR content might produce aren’t perceivable, so you can go to that level of compression with HDR,” Furgusson says. “At the top end of brightness you see more artifacts with HDR than HD at a given level of compression, so you have to spend more bitrate to avoid that.”
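In effect, Furgusson is describing content-adaptive rate allocation: spend fewer bits where artifacts hide in the blacks, more where bright highlights expose them. The toy heuristic below is purely illustrative, not Ericsson’s encoder logic; the luminance thresholds, multipliers and function name are invented to show the shape of the idea.

```python
# Toy illustration (not Ericsson's algorithm) of luminance-keyed rate
# allocation: dark HDR scenes tolerate heavier compression, bright
# highlight-rich scenes need extra bitrate to avoid visible artifacts.
def scene_bitrate_mbps(base_mbps: float, avg_scene_nits: float) -> float:
    """Hypothetical per-scene target bitrate keyed to average luminance."""
    if avg_scene_nits < 10:       # deep shadows: artifacts stay below threshold
        return base_mbps * 0.8
    if avg_scene_nits > 500:      # bright highlights: banding shows readily
        return base_mbps * 1.2
    return base_mbps

print(scene_bitrate_mbps(8.0, 4))     # 6.4 -> dark scene, more compression
print(scene_bitrate_mbps(8.0, 800))   # 9.6 -> bright scene, more bitrate
```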
More testing will be required to develop guidelines around bitrate expectations with HDR, he adds. Asked whether the 10-15 percent range of additional bandwidth that CableLabs has measured for HDR is in the ballpark, he replies, “You can’t go with a straight rule of thumb on this. But HDR will not have the impact on bitrates people anticipated.”
HDR formats, like Dolby Vision, that use 12-bit rather than 10-bit encoding will have a greater bandwidth impact, possibly in the 20-25 percent range, says Patrick Griffis, who serves as executive director of technology strategy in the office of the CTO at Dolby Laboratories and education vice president at SMPTE. But if backward compatibility isn’t required, the dual streams can be compacted into one, resulting in a lower bandwidth penalty. “We’ve worked with major chip vendors to ensure both options are included,” Griffis says.
At IBC in Amsterdam this month, Envivio, set to be acquired by Ericsson pending shareholder approval, offered a dramatic demonstration of how its encoding technology can be used to lower the bitrate of a single-stream version of Dolby Vision HDR with 4K pixel density. The demo, running on a new 1,000-nit Vizio display slated for commercial release by year’s end, showed fast-motion trailer clips from an HDR-enhanced version of the motion picture Oblivion at a bitrate of just 12 Mbps.
The single-stream version of the 12-bit Dolby Vision system added between 1 and 2 Mbps to the total bitrate, or somewhere between 10 and 20 percent, according to an Envivio official. The compromises that had to be made in the encoding process to reach the 12 Mbps bitrate were cleverly obscured through aggressive compression outside the areas of viewing focus in any given scene, resulting in a high-quality viewing experience that won the requisite approval for the demo from Oblivion director Joseph Kosinski.
The idea of making HD a part of the HDR paradigm, now labeled as HDR+, is really a return to the original concept of UHD, which was that it was more about dynamic range than pixel density, Griffis notes. “UHD is really about creating an immersive viewing experience, but the industry got sidetracked for a while by the CE manufacturers’ introduction of 4K TV sets, which focused discussion of UHD on pixel density,” he says. “Now things are getting back to the original intent.”
That intent has been bolstered by the development of SMPTE 2094, a draft standard moving toward adoption that introduces more dynamism into the color transformation process than is provided by SMPTE 2086, otherwise known as “Mastering Display Color Volume Metadata Supporting High Luminance and Wide Color Gamut Images.” SMPTE 2086 serves as the metadata component referenced by the SMPTE 2084 Electro-Optical Transfer Function (EOTF), an alternative to the traditional gamma curve, or Opto-Electric Transfer Function (OETF), that is now part of the BDA’s Ultra HD Blu-ray standard and the HDR specifications recommended by the CEA.
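For reference, SMPTE 2084 defines the Perceptual Quantizer (PQ) curve, which maps a normalized code value to absolute luminance of up to 10,000 nits. Below is a straightforward Python rendering of the published formula using the standard’s own constants; the function name is ours.

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value n in [0, 1] to
# absolute display luminance in cd/m^2 (nits). Constants are those
# published in the standard.
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_eotf(n: float) -> float:
    """Luminance in nits for a normalized PQ code value n."""
    p = n ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(0.0)))     # 0 nits
print(round(pq_eotf(0.508)))   # ~100 nits, the Rec-709 SDR peak cited above
print(round(pq_eotf(1.0)))     # 10000 nits
```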
By providing a means of assigning a dynamic brightness dimension to the rendering of colors by TV displays, SMPTE 2094, an adaptation of the metadata instruction set used in Dolby Vision, brings the full HDR experience into play with 10-bit as well as 12-bit sampling systems. With SMPTE 2084 and ITU Rec-2020 color gamut now specified in both the BDA’s and the CEA’s HDR profiles, a fairly clear HDR roadmap is in place, leaving it to individual industry players to determine whether they want to utilize 12-bit formats like Dolby Vision and the one developed by Technicolor or the 10-bit formats offered by Samsung and others.
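The practical distinction between 2086 and 2094 is static versus dynamic metadata: 2086 describes the mastering display once per program, while 2094-style metadata can change scene by scene to guide tone mapping. The sketch below is a simplified illustration of that distinction; the static fields mirror 2086’s published parameters, while the per-scene record is an invented stand-in for 2094’s content-dependent values, not the standard’s actual syntax.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MasteringDisplayMetadata:
    """Static, once-per-program metadata in the spirit of SMPTE 2086."""
    primaries_xy: List[Tuple[float, float]]   # R, G, B chromaticities
    white_point_xy: Tuple[float, float]
    max_luminance_nits: float                 # peak of the mastering display
    min_luminance_nits: float

@dataclass
class SceneColorVolumeMetadata:
    """Illustrative per-scene record standing in for SMPTE 2094-style
    dynamic metadata; the real standard's syntax differs."""
    scene_start_frame: int
    scene_max_nits: float                     # guides per-scene tone mapping
    scene_avg_nits: float

# One 2086 record covers the whole program; 2094-style records let the
# display adapt its tone mapping scene by scene.
static_md = MasteringDisplayMetadata(
    primaries_xy=[(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],  # Rec-2020
    white_point_xy=(0.3127, 0.3290),                                # D65
    max_luminance_nits=1000.0,
    min_luminance_nits=0.0001,
)
dynamic_md = [SceneColorVolumeMetadata(0, 900.0, 120.0),
              SceneColorVolumeMetadata(480, 150.0, 8.0)]
```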
Dolby at its IBC booth offered its own dramatic demonstration of what can be done with Dolby Vision using SMPTE 2094 to map HDR-originated content through the transfer function onto both HDR and SDR displays. The demo, using a 2,000-nit display not yet commercially available, offered a striking preview of what HDR will look like as such displays enter the market.
At the same time, using a real-time 2094 mapping capability, Dolby also showed how HDR-originated content delivered through the Rec-709 component of the dual-stream Dolby Vision system could be rendered to greatly enhance the SDR display of that content, in comparison to an SDR presentation that didn’t use the SMPTE 2094 mapping. This is another factor that will have to be weighed as network-based distributors consider backward compatibility and its impact on bandwidth.
“The industry is split on the backward compatibility question,” says Ericsson’s Goldman. “Some say it’s essential; others don’t think it’s necessary.” Even for those that do favor backward compatibility, there are other approaches under consideration besides reliance on the dual-stream option that involve separate processing of HDR content for SDR distribution.
But with higher-luminance displays supporting the 68.7 billion colors enabled by Rec-2020 on the near horizon, distributors will have to begin deciding which way they want to go with HDR formats and sampling rates sooner rather than later. And given that use of SMPTE 2094 with the dual-stream 12-bit format also enhances SDR quality, that added benefit will have to be weighed in determining whether the 12-bit dual-stream route is worth the extra cost in bandwidth.