Notwithstanding many fits and starts with little to show for the effort so far, the trend lines are finally aligning to move second-screen TV applications into the consumer mainstream, paving the way for content suppliers and advertisers to exploit the power of personalization to engage viewers more deeply and drive new revenues.
The core trend behind the emergence of what might be called augmented TV is the growing use of smartphones, tablets and laptops not just for multitasking while viewing TV but for conducting search and viewing activities directly related to what’s being watched. In a recent study performed by Sterling Brands and Ipsos for Google, researchers reported that 77 percent of overall TV viewing time in the U.S. now occurs while consumers are engaged with other devices, and that 22 percent of that time is spent on usage that complements TV viewing.
Similar trends hold elsewhere. Looking at German users, Decipher Research concluded in a study performed for Rovi last year that 79 percent of TV viewers have used smartphones to search for information related to what they’re watching.
Other key developments include the maturing of enabling technologies, including highly sophisticated search and discovery systems and the means by which second-screen apps are synched up with split-second accuracy to whatever the viewer is watching on the TV. Standardization of mobile reference applications and HTML5 browser experiences is another key trend, notes Tom Wilde, CEO of RAMP, which provides technology supporting augmented viewing experiences on websites and second screens.
“I’d be shocked if there are any TV programmers out there who don’t have plans to try to figure this out in 2013,” Wilde says. “We know they’re all working on it.”
But, as Wilde acknowledges, it’s going to take a lot more experimentation with viewer reaction to various strategies before augmented TV becomes a regular feature of the TV viewing experience. “Second screen is still a work in progress where programming networks need to figure out what customers will respond to,” he says.
And that work needs to be accomplished before anything significant happens on the advertising front. Much as RAMP has seen with development of augmented viewing experiences on websites, advertisers on the augmented TV side will need to see high levels of user engagement followed by development of useful analytics before they’ll take the plunge. “On the Web we’re just crossing from phase one with the user engagement to where we’re starting to get the metrics,” Wilde says.
“It’s premature to think we’ll see real advertising opportunities emerge for second screen TV viewing in 2013,” he adds. “The comments I get are, ‘Is it going to work? Probably. How? No one knows.’ It’s going to take another year of work to create the analytics that tell us here’s how users behave in this mode and here’s what the advertising opportunity is with these kinds of creative models and offerings.”
Nothing is more important to driving the second-screen opportunity than the power of personalized search enabled through advanced discovery technologies that transcend the limitations of EPGs running on traditional set-top boxes. Of course, these navigational capabilities apply not just to finding programs for viewing on TV sets but also to finding content for viewing on the connected devices themselves, which greatly increases vendors’ incentives to embed advanced navigation systems in the devices or at least to offer them as downloadable apps.
As frequently reported in these pages, OEM and other types of partnerships between suppliers of discovery technology and providers of middleware for every type of device from traditional set-tops to smartphones are cropping up everywhere. A new case in point is the just-announced alliance between middleware supplier Alticast and ThinkAnalytics, a supplier of discovery technology that is used to one extent or another on navigation systems serving more than 80 million viewers in 16 countries worldwide.
Alticast will integrate ThinkAnalytics’ recommendation engine into its Windmill multiscreen distribution platform to bring personalized, multiplatform recommendations and intelligent navigation to Alticast’s customers in Europe, North America, Latin America and Asia, says Alticast CMO Thomas Jung. “Having robust and effective search and recommendation functionality is a crucial piece of the puzzle for any operator or device manufacturer that wants to deliver a full and exciting multiscreen experience to consumers,” Jung says.
Alticast, with a strong focus on MVPDs (multichannel video programming distributors), points up how important multiscreen navigation and, with it, second screen-based TV navigation have become to this sector. Commenting on a recent study by ABI Research, Sam Rosen, practice director of TV and video at ABI, names other entities in the hunt for MVPD advanced navigation contracts, including “established digital media companies such as Rovi and Technicolor, TV middleware companies (notably, Viaccess Orca with its COMPASS recommendation technology), together with a set of innovating companies, including Digitalsmiths, APRICO, and Gravity R&D (winners of the Netflix prize for improving search algorithms).”
Second screen experiences are becoming an ever bigger part of MVPD agendas, Rosen adds, noting, “These systems typically leverage cloud-based technologies to compensate for the older technology in the home.”
But it’s the programmers who will be the primary drivers of augmented TV via second-screen apps that go beyond TV navigation, asserts Wilde. “We’re skeptical that MVPDs, Hulu, Microsoft Xbox or whoever in the aggregator space can deliver this experience without programmers’ taking the initiative,” he says.
“Comcast doesn’t know more about HGTV or Discovery than they know about themselves and their viewers,” he continues. “So the programmers have to come up with the relevant user experiences and make those available to all these edge points, be they MVPDs or OTT aggregators. This is what has to happen first.”
While MVPDs have a great incentive to move ahead with branded second screen experiences, they haven’t been able to execute on their own, Wilde says. “We’ve talked to many of those guys, but every time we walk out of the discussions scratching our heads, because, while their vision is good, there’s no way they can execute by creating compelling experiences by themselves.”
Meanwhile, with RAMP’s focus on enabling augmented TV from the programming side, Wilde is seeing a “new area of tension opening up” between MVPDs and programmers. “Clearly, MVPDs want the brand recognition with the immersive experience,” he says. “But you’re seeing the door open a crack to where that’s not the case with initiatives like HBO Go and Watch ESPN. It’s a little bit dangerous for MVPDs to allow those experiences in, but it’s smart of the programmers to say, ‘I don’t want this to be a FiOS or Xfinity thing.’”
That said, he adds, there are probably only 10 or 12 networks with enough marketing clout to pull off such efforts without a branding tie-in with each MVPD. “ESPN was the first programmer ever to force YouTube to embed a third-party player,” he notes. “Who owns the customer experience will be a factor in how second-screen unfolds.”
For a glimpse of the future of augmented television, one need only visit the websites of media companies that have engaged RAMP’s technology to enable augmented experiences in that space. Customers include NBC Universal, Fox Sports Interactive, Sony’s Crackle, Better Homes & Gardens and many others – nearly 100 firms with RAMP apps running on some 500 sites, Wilde says.
RAMP’s technology has its roots in speech-to-text and natural language processing technologies developed at the Boston-based R&D firm BBN through a $100-million Defense Department-funded research program. RAMP was spun out of BBN in 2007, leading to the launch of a series of products in the ensuing years that allow media publishers to make any kind of content highly discoverable through faceted parsing of data from video, text, audio and images.
Through its MediaCloud platform RAMP pulls the data aggregated from this parsing process into a cloud-based index of metadata, Wilde explains. Utilizing time-coded transcripts with the other data obtained from content streams and Web crawling, the platform creates topic indices that optimize video content for site searches and myriad applications.
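The time-coded indexing Wilde describes can be illustrated with a minimal sketch, in the spirit of the article rather than RAMP’s proprietary implementation (the transcript data, function name and stopword list here are all hypothetical): keywords extracted from a time-coded transcript are mapped to the moments where they occur, which is the basic structure needed to make video searchable by topic.

```python
from collections import defaultdict

# Hypothetical sketch of a time-coded topic index. Each transcript segment
# carries a start time (seconds) and its text; the index maps each keyword
# to the timestamps where it is mentioned, so a search can jump straight
# to the relevant moment in a video.

STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "we", "at", "first"}

def build_topic_index(transcript):
    """transcript: list of (start_seconds, text) tuples."""
    index = defaultdict(list)
    for start, text in transcript:
        for word in text.lower().split():
            word = word.strip(".,!?")
            if word and word not in STOPWORDS:
                index[word].append(start)
    return index

transcript = [
    (0, "Welcome to the kitchen remodeling special."),
    (10, "First we look at granite countertops."),
    (20, "Granite is durable and easy to clean."),
]

index = build_topic_index(transcript)
print(index["granite"])  # every timestamp where "granite" comes up
```

A real system would of course use speech-to-text output, entity extraction and ranking rather than naive word splitting, but the resulting index has this same keyword-to-timecode shape.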
For example, the content optimization engine can generate thumbnails of video content every ten seconds, text bubbles describing content as the viewer watches and other components to augment the viewing experience. In the case of the Better Homes & Gardens site, a video playing in the main field of vision is complemented by images and clips relating to what’s being discussed in the primary video. This is done automatically with minimal manual preparation through RAMP’s newly announced MetaQ real-time rules engine, which reacts to keywords in the video and draws from other content sources to bring the ancillary content into the right-hand column.
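A rules engine of the kind MetaQ represents can be sketched in a few lines. The rule format, keywords and content identifiers below are invented for illustration and do not reflect RAMP’s actual product: each rule ties a keyword to a piece of ancillary content, and the engine fires whichever rules match the current stretch of transcript.

```python
# Hypothetical sketch of a keyword-triggered rules engine: when a keyword
# appears in the video's transcript, the matching rule pulls related
# ancillary content into the sidebar. Rules and content IDs are invented.

RULES = [
    {"keyword": "countertop", "content": "slideshow:granite-vs-quartz"},
    {"keyword": "backsplash", "content": "clip:tile-installation-howto"},
    {"keyword": "lighting",   "content": "article:kitchen-lighting-guide"},
]

def match_rules(caption_text, rules=RULES):
    """Return the ancillary content items triggered by this caption."""
    text = caption_text.lower()
    return [r["content"] for r in rules if r["keyword"] in text]

matched = match_rules("Next, we replace the countertop and add lighting.")
print(matched)
```

The appeal of the rules approach the article describes is that this matching runs continuously against the live transcript, so the sidebar updates without an editor hand-picking content for each moment of the program.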
“Some of what’s being delivered is tied to monetization,” Wilde says. This includes complementary advertising, but, perhaps more important, the ancillary video thumbnails that appear in the right-hand column induce viewers to click through to those videos, resulting in more exposure for pre-roll ads associated with that content. “Our customers are very keen on getting users to watch more,” he says.
Now these capabilities are moving to second-screen applications, starting with just-announced plans for the annual People’s Choice Awards ceremony, airing on CBS in early January, to employ RAMP’s MediaCloud to deliver bios, news stories, tweets and other content precisely synched to what’s showing on the TV. The People’s Choice website has long used RAMP to deliver ancillary content with video from the live TV program; the key difference is that, with the second-screen experience, everything has to be synched with the TV program on a frame-by-frame basis.
“Synchronization is very difficult and very important,” Wilde says. “We have unique technology supporting synchronization which we’ll be talking about later in the year.”
This new technology is designed to work with whatever means of automatic content recognition (ACR) RAMP’s customers choose. “ACR is not synchronized TV, it’s a component of it,” he says. “Once you have ACR you still have to orchestrate synchronization. Many people will use ACR, but there will be a path where ACR isn’t necessary. We’ll work in this environment as well.”
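The distinction Wilde draws, that ACR identifies what is on screen while synchronization decides what the second screen shows and when, can be sketched roughly as follows. The ACR fix format, schedule layout and function names are hypothetical, intended only to separate the two roles:

```python
import bisect

# Hypothetical sketch of sync orchestration layered on top of ACR. An ACR
# fix tells us the offset into the program at a known wall-clock moment;
# the orchestrator extrapolates the current offset and selects whichever
# ancillary item is scheduled for that point in the broadcast.

# Ancillary items keyed by program offset in seconds, sorted by offset.
SCHEDULE = [
    (0,   "bio:host"),
    (95,  "tweet-stream:red-carpet"),
    (240, "news:best-picture-nominees"),
]

def current_item(acr_offset, acr_wallclock, now, schedule=SCHEDULE):
    """Extrapolate program position from the last ACR fix, then pick the
    item whose scheduled offset most recently passed."""
    position = acr_offset + (now - acr_wallclock)
    offsets = [t for t, _ in schedule]
    i = bisect.bisect_right(offsets, position) - 1
    return schedule[i][1] if i >= 0 else None

# ACR reported 90 s into the program at wall-clock 1000; it is now 1010,
# so the viewer is 100 s in and the 95 s item is the one to display.
item = current_item(acr_offset=90, acr_wallclock=1000, now=1010)
print(item)
```

This also hints at Wilde’s point that ACR is optional: any source of a (program, offset) fix, such as a broadcaster-supplied timing signal, could feed the same orchestration step.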
In exploring what needs to be done to meet TV programmers’ demands for accurate, reliable synchronization, the “number one thing we heard was, how do we do live TV?” Wilde says. “We put our heads down and came up with a way to get to what our customers were asking for.”