Dave Evans is the chief futurist for Cisco Systems and the chief technologist for the Cisco Internet Business Solutions Group. This is a guest post for ESPN.com.
The 2012 London Olympic Games provided one of the most connected global experiences ever. Myriad streams of content were available on TV and online, not to mention on multiple devices, including smartphones, tablets and desktop computers.
But the world isn’t finished innovating and creating new ways to enhance the viewing experience for future large-scale sporting events, such as the Olympics and the World Cup.
In the past, TV networks covering the Olympics delivered only prime-time evening coverage of what happened that day, with taped highlights of various events tightly packaged into several hours of programming. That approach changed dramatically with the 2008 Beijing Olympics when NBCUniversal, the provider of Olympics coverage for American audiences, offered not only hours of TV and cable coverage but also thousands of hours of coverage online at NBCOlympics.com.
NBCUniversal followed a similar formula at this summer’s London Olympics, providing live-streamed or taped coverage of the 26 medal sports at the Games. The content was optimized for viewing on multiple devices.
The "connectedness" of the 2012 London Games was illustrated by the 9.66 million tweets generated by people watching the opening ceremony on July 27 -- more than the entire volume of tweets for the 2008 Beijing Games. NBC reported that it delivered 3,600 hours of programming on its main network and various cable channels for the Beijing Olympics, compared with 2,200 hours of streaming video online.
The proliferation of tablet computers that began with the appearance of the iPad in 2010 boosted demand for streaming video on those devices in 2012. During the London Games, viewers streamed a total of 20.4 million hours of video, according to NBC. Additionally, NBC reported that 30 percent of the total video streams were viewed in its two mobile applications.
Given the increasing sophistication of technology, the availability of high-definition still-image and video cameras and the expanding bandwidth of computer networks, the ability to enhance the user experience is constantly evolving. And not just for viewers, but for athletes, coaches and reporters.
Thousands of fans attending global sporting events are using their smartphones to record and post video content to YouTube and social commentary to sites such as Facebook, Twitter and Google+. By the time the 2014 World Cup or the 2016 Olympic Games begin, there could be an order-of-magnitude increase in the number of video channels available.
But it doesn’t stop there.
Viewers at home will have even more options than what's available today. I foresee technology that will allow home viewers to see what someone in the stands is recording on his or her video camera or smartphone. Because holding a camera steady for long periods of time can be tiring, we could soon see wearable cameras that attach, for example, to someone's glasses. Imagine sitting on a park bench in Boston and watching a match on your tablet through the real-time perspective of a fan's sunglasses in Brazil. The technology advances required to deliver that experience -- enhancements to the device and the camera -- are in the works.
You will be able to switch from person to person to get different angles and reactions. Devices will be GPS-enabled and/or connected to Wi-Fi at the venue, so viewers will be able to triangulate the location of different fans relative to one another. You will see a map of the venue and be able to choose the perspective from which you wish to view the event -- adding levels of depth to watching a long-course event such as the marathon.
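As a rough illustration of how a viewer's app might pick a fan camera by location, here is a minimal sketch. The fan IDs, GPS fixes and function names are all made up for the example; a real system would match streams to positions this same basic way, by computing distances between fixes:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_camera(fans, lat, lon):
    """Return the id of the fan camera closest to the spot the viewer tapped."""
    return min(fans, key=lambda f: haversine_m(f["lat"], f["lon"], lat, lon))["id"]

# Three fans streaming from points along a marathon course (illustrative fixes).
fans = [
    {"id": "fan-a", "lat": 51.5007, "lon": -0.1246},
    {"id": "fan-b", "lat": 51.5014, "lon": -0.1419},
    {"id": "fan-c", "lat": 51.5033, "lon": -0.1196},
]
print(nearest_camera(fans, 51.5010, -0.1415))  # the fan nearest the tapped spot
```

Tapping a point on the venue map would simply hand its coordinates to a lookup like this and switch the stream.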
In addition, smart devices will deliver more information to fans. Say you want to follow a hometown favorite as he or she competes. You will soon be able to touch the image of that athlete on your tablet computer to call up an array of stats, such as medals won in previous competitions and video of the athlete's qualifying performance. Broadband networks will let viewers share the images they have captured of their hometown hero to view on a laptop or HDTV.
Today, most coverage of global sporting events consists of views from network TV cameras mounted strategically at fixed places of interest, such as the finish line of a track event, below the surface of the swimming pool, in a boxing ring or alongside rowers. But at future events, expect to see those cameras shrink dramatically in size and become untethered. You may view video captured by cameras attached to a javelin as it's being thrown, to a kayak as it courses through the water, to a track athlete as he is jumping the hurdles or to a swimmer as she completes her leg of the 4x100-meter freestyle relay.
We're going to see a much more dynamic broadcast, where the athletes, equipment and playing field all have cameras mounted on them. We're already seeing trends toward tiny cameras and higher-fidelity image sensors.
Just as TV networks pay drivers to put cameras inside race cars, they could pay athletes to wear cameras on their clothes, their glasses or, as technology progresses, even on a contact lens via a tiny, embedded image sensor.
The possibilities expand further as technology at future sporting events -- perhaps even after 2016 -- delivers "augmented reality." That's the name for technology that superimposes additional information on an image. When it comes to Olympic sports, the image from a camera mounted on an athlete could be augmented with data from sensors placed on his or her body. As a result, as the athlete runs down the track, the sensors could display information about his or her heart rate, speed and respiration along with external information such as wind speed and direction. Imagine seeing that information displayed on the screen as Usain Bolt thunders to the finish of the 100-meter final.
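The mechanics of that overlay are simple to sketch: pair each video frame with the sensor reading nearest to it in time, then render the combined data as a caption. This is a hypothetical example with made-up readings and function names, not any broadcaster's actual system:

```python
def overlay_caption(frame_ts, sensor_samples, wind):
    """Build the caption a broadcast might superimpose on one video frame."""
    # Pick the body-sensor sample closest in time to the frame's timestamp.
    s = min(sensor_samples, key=lambda x: abs(x["ts"] - frame_ts))
    return (f"HR {s['heart_rate']} bpm | {s['speed_mps']:.1f} m/s | "
            f"resp {s['respiration']}/min | "
            f"wind {wind['speed_mps']:.1f} m/s {wind['dir']}")

# Illustrative samples from a sprinter's body sensors (timestamps in seconds).
samples = [
    {"ts": 9.8, "heart_rate": 182, "speed_mps": 11.6, "respiration": 48},
    {"ts": 10.0, "heart_rate": 184, "speed_mps": 11.9, "respiration": 50},
]
print(overlay_caption(9.95, samples, {"speed_mps": 1.2, "dir": "SW"}))
```

In practice the sensors, the camera and the graphics system would all share a common clock so the biometric numbers stay in step with the picture.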
Such biometric information combined with the video image could help coaches and athletes train more effectively. Just as football players meet with their coaches on Monday to analyze footage of Sunday’s game, athletes could use up-to-date technology to review and improve their performances.
The coach of an athlete who wins a bronze medal could play a video with a biometric overlay alongside that of the gold-medal-winning athlete to determine the factors that separated first and third place.
Technology already supports part of many athletes' training, such as when a runner wears patches of reflective tape on his or her track suit and sprints past a bank of high-speed cameras that record the stride. The information is beamed to the coach's laptop or tablet, and a software program analyzes the runner's movements. Similar technology is used to study a swimmer's stroke.
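The analysis software's core job can be shown in a few lines: from the timestamped positions of a marker at successive foot strikes, derive average stride length and cadence. The figures and function names here are invented for illustration, assuming positions are measured along the direction of travel:

```python
def stride_stats(strikes):
    """strikes: list of (time_s, position_m) at successive foot strikes."""
    lengths = [b[1] - a[1] for a, b in zip(strikes, strikes[1:])]
    times = [b[0] - a[0] for a, b in zip(strikes, strikes[1:])]
    avg_len = sum(lengths) / len(lengths)       # meters per stride
    cadence = 60.0 * len(times) / sum(times)    # strides per minute
    return avg_len, cadence

# Four foot strikes captured by the high-speed cameras (made-up data).
strikes = [(0.00, 0.0), (0.46, 2.1), (0.91, 4.3), (1.37, 6.4)]
length_m, per_min = stride_stats(strikes)
print(f"{length_m:.2f} m per stride, {per_min:.0f} strides/min")
```

A coach would watch these numbers shift across a training block, rather than read any single run in isolation; the same arithmetic applied to a swimmer's hand positions yields stroke length and stroke rate.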
We're seeing a new class of data flowing across the network. High-definition video requires more bandwidth, and forecasts show that, by 2016, 90 percent of data traversing the Internet and other networks will be video. The number of mobile connected devices will also continue to proliferate, pumping more data through networks. Plus there are billions of sensors, such as those that record an athlete's vital signs, that transmit data. I expect that the number of such sensors, maybe 10 billion today, will grow to 50 billion over the next eight years -- two Olympics from now. The networks of tomorrow are going to be much more fluid, dynamic and mobile than those of today.
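That projection — 10 billion sensors today growing to 50 billion in eight years — implies a steep compound annual growth rate, which a quick back-of-the-envelope check makes concrete:

```python
# Growing 10 billion sensors to 50 billion over eight years:
start, end, years = 10e9, 50e9, 8
rate = (end / start) ** (1 / years) - 1
print(f"~{rate * 100:.0f}% per year")  # roughly 22% compound annual growth
```

At that pace the sensor population would more than double between one Summer Olympics and the next.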
As technology advances and network capacity improves, new ways for people to experience events such as the Olympics, the Super Bowl, the World Cup and the World Series become possible.
Ultimately, technology serves one purpose: to improve the human experience. It's really not about the device; it's about people improving their experiences in ways that work for them. It's about getting closer to the athletes and bringing us closer together as a society.