After years of hype (and several more recent years of near-total invisibility), the 3D revolution is finally dead. LG and Sony were the last two major manufacturers backing the format, and both have dropped support from their modern televisions. Any 3D content you still own can, of course, be played back on a TV that supports it. But none of the companies at CES 2017 are shipping 3D on any new panels.
3D TVs debuted at a time when 3D was making broad inroads across computers and televisions alike. The high-profile success of multiple 3D movies made theaters salivate at the prospect of increased revenue from premium ticket sales — sales that, theoretically, would also drive increased sales of 3D movies on Blu-ray. Sony patched 3D support into the PlayStation 3, while Nvidia put a hefty push behind it on GeForce cards (AMD had its own 3D implementation as well, but the company mostly focused on pushing multi-monitor gaming during the same period).
3D TV was supposed to be the second coming. Instead, it fizzled. “3D capability was never really universally embraced in the industry for home use, and it’s just not a key buying factor when selecting a new TV,” LG’s Tim Alessi told CNET. “Purchase process research showed it’s not a top buying consideration, and anecdotal information indicated that actual usage was not high. We decided to drop 3D support for 2017 in order to focus our efforts on new capabilities such as HDR, which has much more universal appeal.”
Why 3D TV failed
There were a number of reasons why 3D TV failed in the market, and some cautionary lessons for VR fans (including myself). First and foremost, 3D content was gated behind expensive equipment purchases. It wasn't enough to have a Blu-ray player; you needed a Blu-ray player with 3D support and a TV that offered the same. Most 3D TVs required a pair of glasses for each viewer, while the few glasses-free models suffered from limited viewing angles and distances.
A certain amount of living room finagling is nothing new to TV watching, but this was a larger problem than just rearranging a few chairs. It was difficult and expensive to rig a living room for multi-person 3D viewing, and you had to have enough 3D glasses for your entire audience.
Another major problem? Content. A handful of movies made for and shot in 3D, like Avatar, may have popularized the format, but few movies were filmed to take full advantage of it. Many limited themselves to using 3D in specific scenes, or were filmed in 2D and converted to 3D afterward. It's cheaper (or seems to be, based on how many studios went this route) to convert films in post-production than to shoot them in 3D from the beginning. It's one thing to ask people to pay for The Next Big Thing, and something else entirely when they're shelling out premium cash for a TV, a movie, and extra goggles, all while knowing that only 20-30 minutes of a film may be truly 3D in the first place.
In addition, 3D is prone to giving some people headaches and motion sickness, which, again, makes it harder to sit through a film or other 3D content. The third time Grandpa runs for the bathroom, or your kid paints the 3D glasses black because it makes them look cool, you'll wind up wishing you'd saved your money and just bought the regular TV and Blu-ray instead.
What does this mean for VR?
The story of 3D's rise and fall is a cautionary tale for the VR industry as well. I love VR and would like to see it shape the future of gaming, but many of the issues that doomed 3D TV and 3D content could also kneecap VR adoption. Like 3D, it requires expensive, personal peripherals. Like 3D, games need to be designed explicitly for VR to showcase the technology to best effect. Like 3D, VR can cause nausea and headaches. And like 3D, working in VR demands an entirely new set of best practices, some of which aren't intuitive to people who have spent their careers on conventional design.