BRISTOL, Conn.—Many people know ESPN for its broadcast television offerings, which feed almost every sport you can imagine into the TVs of more than 93 million homes worldwide.
But ESPN's digital assets are booming, too. ESPN.com gets roughly 18 million unique visitors per month, and could get more this week as the Colorado Rockies prepare to face the Boston Red Sox in the World Series on Oct. 24 at Fenway Park in Boston.
John Zehr, senior vice president of ESPN Digital Media Production, and his team are tasked with making it possible for people to enjoy the sports viewing experience through platforms other than TVs.
During a meeting at the company's sprawling campus here, Zehr explained to eWEEK how he is responsible for the video on ESPN.com, interactive television and the organization's mobile phone sports offerings.
Zehr is particularly fond of ESPN360.com, a global video-through-PC service that offers 2,500 events over a 12-month period, which is more than the company's flagship channels, ESPN and ESPN2, combined. Imagine watching TV through your PC, but instead of watching one sporting event on one channel, ESPN360.com enables users to recreate ESPN's TV control room.
In other words, ESPN360.com enables your PC to become your mission control console, where you can view up to ten different events on six mini-screens, switching back and forth between events and pausing and resuming play in a TiVo-like fashion.
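As a rough sketch of what that "mission control" behavior implies in software, the snippet below models feeds assigned to a fixed number of mini-screens, each pausable and resumable on its own. All class and method names here are hypothetical; ESPN has not published how ESPN360.com is built.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class FeedState:
    """Playback state for one mini-screen (names are illustrative)."""
    event_id: str
    position_sec: float = 0.0   # where playback currently sits
    paused: bool = False

class MultiFeedConsole:
    """Toy model of a six-screen viewer; not ESPN360.com's actual code."""
    MAX_SCREENS = 6

    def __init__(self) -> None:
        self.screens: Dict[int, FeedState] = {}

    def assign(self, slot: int, event_id: str) -> None:
        # Put an event on one of the six mini-screens.
        if not 0 <= slot < self.MAX_SCREENS:
            raise ValueError("only six mini-screens are available")
        self.screens[slot] = FeedState(event_id)

    def pause(self, slot: int) -> None:
        self.screens[slot].paused = True    # freeze this feed, TiVo-style

    def resume(self, slot: int) -> None:
        self.screens[slot].paused = False   # pick up where the viewer left off
```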
ESPN360.com incorporates Flash and a streaming video technology from Move Networks, an American Fork, Utah, startup that breaks video into small files called streamlets and delivers it over traditional HTTP Web traffic protocols.
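To illustrate the streamlet idea in general terms: because each chunk of video is just a small file retrieved over ordinary HTTP, standard web caches and CDN edge servers can store and re-serve it. The sketch below, with a made-up URL naming scheme, shows roughly what a client-side fetch loop might look like; it is not Move Networks' actual protocol.

```python
import urllib.request

def fetch_streamlets(base_url: str, start: int, count: int):
    """Fetch small, sequentially numbered video chunks ("streamlets") over
    plain HTTP. The segment naming scheme below is purely illustrative."""
    for seq in range(start, start + count):
        url = f"{base_url}/segment_{seq:06d}.vid"   # hypothetical chunk name
        with urllib.request.urlopen(url) as resp:   # ordinary HTTP GET, cacheable by a CDN
            yield resp.read()

# Illustrative usage: hand each chunk to a decoder/player in order.
# for chunk in fetch_streamlets("http://cdn.example.com/game123", 0, 10):
#     player.feed(chunk)   # 'player' is a placeholder for whatever renders the video
```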
“It allows us to [use a] CDN (content delivery network) like Akamai or Limelight a lot more effectively and cache some of those assets, and lowers the cost and allows you to do higher quality,” Zehr said.
Moreover, Zehr and his team will soon be adding TiVo-like capabilities by mashing up play-by-play data with the ESPN360.com video applet. For example, ESPN will put markers where scoring plays happen, and viewers can go back and replay the plays that are marked.
“You can produce your own highlight reel,” Zehr said.
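A minimal sketch of how such markers could be mashed up with a video timeline follows; the field names and the notion of a "scoring" flag are assumptions for illustration, not ESPN's data model.

```python
from dataclasses import dataclass
from typing import Iterable, List, Tuple

@dataclass
class PlayMarker:
    """One play-by-play event pinned to the video timeline (illustrative)."""
    label: str             # e.g. "Home run, bottom of the 4th"
    offset_sec: float      # where the play starts in the stream
    duration_sec: float
    scoring: bool = False  # whether this marker flags a scoring play

def build_highlight_reel(markers: Iterable[PlayMarker],
                         scoring_only: bool = True) -> List[Tuple[float, float]]:
    """Return (start, end) clip ranges a player could jump between,
    letting a viewer replay marked plays as a do-it-yourself highlight reel."""
    return [(m.offset_sec, m.offset_sec + m.duration_sec)
            for m in markers
            if m.scoring or not scoring_only]
```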
ESPN also sees mobile as a major focus for its consumers, offering a client-side application called MVP. Available only through Verizon, MVP runs on Web-based mobile phones, playing video and rendering real-time scores.
For those viewers hungry for more than just scores and updates, ESPN also offers Mobile TV, which uses UHF spectrum—not cellular signals—to pipe TV through Web phones with just a few seconds of latency.
But does ESPN struggle to get consumers to buy into mobile content, which must be viewed on small, sometimes hard-to-navigate mobile phone screens?
“Ease of use is a tough one,” said Zehr, likening it to the early days of dial-up access on the Internet. “We spend a lot of time trying to make things easier and to try and influence the carriers. It's getting people habituated to it.”
That's not the only thing Zehr worries about. ESPN.com gets ridiculously big spikes in traffic, which are handled by servers housed in a datacenter in Seattle.
“Our datacenter is highly scalable but sports sees peaks no one else sees,” Zehr said, noting that the opening weekend of NCAA basketball spurs big peaks, along with the passing of sports celebrities.
“It's something I lose a lot of sleep over,” Zehr said of the traffic spikes.
Currently, the datacenter for ESPN's digital assets comprises a lot of dedicated servers and storage designed for specific functions.
There are no quick knobs the administrators can turn to move systems from, for example, baseball to football to accommodate traffic spikes.
Eventually, Zehr said, he hopes to move to a virtualized environment, using software from VMware or servers from Azul Systems, which allows companies to scale capacity and processing power without adding physical infrastructure.
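The appeal of virtualization here is the "knob" the current dedicated setup lacks: a shared pool of capacity that can be pointed at whichever sport is about to spike. The toy model below shows the idea in the abstract; it is not a VMware or Azul Systems API, and the workload names and unit counts are made up.

```python
class CapacityPool:
    """Toy model of shifting a fixed pool of virtual capacity between workloads."""

    def __init__(self, total_units: int) -> None:
        self.total = total_units
        self.allocations = {}   # workload name -> capacity units

    def allocate(self, workload: str, units: int) -> None:
        # Re-point capacity at a workload without racking new servers.
        used_elsewhere = sum(v for k, v in self.allocations.items() if k != workload)
        if used_elsewhere + units > self.total:
            raise RuntimeError("pool exhausted; a dedicated setup would need new hardware")
        self.allocations[workload] = units

# Ahead of an NCAA opening weekend, shift the pool toward basketball:
pool = CapacityPool(total_units=100)
pool.allocate("baseball", 20)
pool.allocate("basketball", 70)
```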
ESPN's TV assets, the company's old-line, bread-and-butter business, use their fair share of IT, too.
Chuck Pagano, executive vice president and CTO, who is responsible for the technology that supports ESPN's vast TV network, took eWEEK first to the “ingest room.”
Located in the campus digital center, the ingest room is spacious and loaded with television screens, computers and enough seats for 65 people crunching data.
The ingest room is, Pagano explained, the organization's stomach, absorbing all of the video from the outside world that ESPN must process for production. Pagano said employees in the ingest room serve as enzymes, breaking down the video and putting content into the company's metadata database.
“In one 24-hour period, we probably record anywhere from 250 to 300 hours of video and audio content and it doubles on the weekend days,” Pagano said, noting that ESPN's network can run at 320 gigabits per second, supported by roughly 300 terabytes of online disk storage.
The data is digitized, tagged with descriptors to help production staff recall it, dumped into digital media servers and stored in native high definition or native standard definition. Feeds go from the digital media servers to one of the many edit rooms for polishing.
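In outline, each ingested clip ends up as a record in the metadata database: an identifier, its native format, and the descriptors production staff will search on later. The sketch below is an assumed shape for such a record, not ESPN's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class IngestRecord:
    """Assumed shape of a descriptor record written during ingest (illustrative)."""
    clip_id: str
    source_feed: str                       # which incoming feed the video came from
    recorded_at: datetime
    video_format: str                      # kept native: "HD" or "SD"
    descriptors: List[str] = field(default_factory=list)  # tags for later recall

def tag_clip(record: IngestRecord, *tags: str) -> IngestRecord:
    """Attach descriptors so the edit rooms can find the footage later."""
    record.descriptors.extend(tags)
    return record
```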
Pagano then led eWEEK to “mission control,” a production room where the famed SportsCenter is produced every night. There, producers and directors sit in front of a bank of nine high-resolution rear-projection screen displays.
The production crew pulls clips off of servers to splice in with the sportscasts, and anything on the monitor wall can be virtually recreated on the fly from a file server. There are two more such quality control rooms, so if the gear in the main one goes buggy, one of the other two takes over.
The last major room the ESPN video hits is the master control room, which Pagano refers to as the “cash register.” The polished product goes to this room to be packaged with commercials and promotions, then beamed out from ESPN via one of the satellite dishes that dot the campus.
Content leaves on a fiber-optic cable to the earth station up the hill and is transmitted to a satellite. One 10-meter dish sends the domestic product up to a satellite, while another provides a transatlantic connection to London.
All of this, of course, entails quite a bit of power consumption. Pagano said he and his team are more worried about what they are having for dinner and whether or not the beer is cold than they are about running out of power or bandwidth.
“The whole facility is on an uninterruptible power supply,” Pagano said. “We go to the Nth degree to make sure we have more robust systems.”