As millions of Americans turned to the internet for news and information about the Sept. 11 terror attacks in New York and Washington, D.C., data traffic periodically overloaded fiber-optic networks, prompting error messages and long waits to load Web pages.
When U.S. Attorney General John Ashcroft gave out the Internet address where the FBI is collecting tips on suspected terrorists, the site quickly collapsed under the weight of those seeking access.
The traffic surges during the national crisis turned out to be temporary, and in many cases were quickly resolved. News outlets like ABC News and CNN dumped bandwidth-eating graphics and video in order to make text available more quickly. Lower-bandwidth traffic like e-mail and instant messaging operated smoothly.
But these events underscore what many see as a steadily growing problem with congestion on data networks: bottlenecks at peering points, where traffic queues up at switches and routers, is buffered, and then either backs up for retransmission or slows to a crawl.
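To see how those bottlenecks form, consider a minimal toy model of a single router port: packets arrive into a fixed-size buffer and drain at a constant rate, and once a surge outpaces the drain rate, queues build, delay grows, and packets are dropped for retransmission. The sketch below is purely illustrative; the buffer depth and rates are hypothetical, not figures from any carrier.

```python
from collections import deque

# Illustrative toy model (not any carrier's actual equipment): a router port
# with a fixed-size buffer draining at a constant rate. When arrivals outpace
# the drain rate, the queue fills, delay grows, and eventually packets drop.
BUFFER_SLOTS = 100        # hypothetical buffer depth, in packets
DRAIN_PER_TICK = 10       # packets forwarded per time slot

def simulate(arrivals_per_tick):
    queue = deque()
    dropped = 0
    for tick, arrivals in enumerate(arrivals_per_tick):
        for _ in range(arrivals):
            if len(queue) < BUFFER_SLOTS:
                queue.append(tick)    # remember when the packet arrived
            else:
                dropped += 1          # buffer full: packet lost, must be retransmitted
        for _ in range(min(DRAIN_PER_TICK, len(queue))):
            queue.popleft()
        print(f"tick {tick:2d}: queued={len(queue):3d}  dropped so far={dropped}")

# A traffic surge: normal load, then a spike well above the drain rate.
simulate([8] * 5 + [40] * 10 + [8] * 5)
```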
Refocusing the Crystal Ball
The downturn in the economy and the telecom shakeout had led most experts to predict reduced demand on data networks. Wall Street and the venture capitalists who control the purse strings for further development have been saying for months that fiber-optic networks are overbuilt, resulting in a glut of bandwidth that could go unused for years.
But studies and industry experts now say that data traffic carried by network service providers has instead been growing steadily, perhaps at a rate of at least 30 percent to 40 percent a year. That, they contend, has led to increasing problems with congestion during high-traffic times on public networks, especially where private and public systems converge.
Lawrence Roberts, one of the primary developers of Internet precursor ARPANet and CTO of Caspian Networks, recently released research indicating that Net traffic has quadrupled during the past year alone.
Roberts attributes much of that bandwidth demand to corporations using the Internet as an alternative to more costly private networks. By his estimates, 80 percent of network traffic now comes from corporate data activities.
The drive to expand residential broadband connections through DSL and cable modems complicates the issue. More high-speed connections from home will push the use of bandwidth-hungry applications, like streaming video, and put additional pressure on data networks.
Telecom industry consultancy TeleChoice earlier this month released results of a study indicating that the majority of long-haul routes between 12 major U.S. cities already operate at or near capacity. Of 22 routes studied, only four showed consistent excess capacity, according to the study.
It also concluded that excess capacity on the nation’s networks will probably evaporate by 2004, requiring upgrades in current networks and new construction. The situation is aggravated by a lack of new capital for network expansion. The study focused on transport networks and did not specifically evaluate network congestion issues.
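A bit of back-of-the-envelope arithmetic shows why a cushion can evaporate that quickly. The sketch below uses illustrative numbers, not the TeleChoice data: a route running at 40 percent of capacity today, with demand growing 30 to 40 percent a year, reaches its ceiling in roughly three to four years.

```python
# Back-of-the-envelope sketch (illustrative numbers, not the TeleChoice data):
# how quickly steady traffic growth consumes a fixed capacity cushion.
def years_until_full(utilization, annual_growth):
    """Return how many whole years until demand reaches capacity."""
    years = 0
    while utilization < 1.0:
        utilization *= (1 + annual_growth)
        years += 1
    return years

# A route at 40% utilization today, growing 30-40% a year,
# is full in roughly three to four years -- a 2001-to-2004 horizon.
for growth in (0.30, 0.40):
    print(f"{growth:.0%} growth: full in ~{years_until_full(0.40, growth)} years")
```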
Carriers selling bandwidth to business customers are not eager to talk about congestion. Some critics contend that carriers are not being candid about the extent of congestion problems as they market access to businesses and consumers.
Plenty of Fiber
There is plenty of optical fiber in the ground. The problem is that much of it remains dark.
Network builders filled already open trenches with extra fiber and conduit because it’s cheaper than digging up a right-of-way more than once. And it’s this excess dark fiber that skews the figures, and the public’s sense, of how much long-haul capacity is available at a given time.
But Blake Kirby, a vice president at Boston-based consultancy Adventis, downplays concerns over congestion, noting that businesses that move significant amounts of data over public networks can buy premium services that give their data priority when nodes are clogged by heavy demand. Or they can use private networks.
“And MPLS [Multiprotocol Label Switching] should deal with congestion at most levels as traffic increases,” Kirby says. “I probably would agree that carriers are not aggressive about telling consumers much about congestion issues. But then, consumers aren’t buying those premium services.
“If someone is complaining about congestion, they’re probably not paying enough,” Kirby adds.
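The premium services Kirby refers to rest on a simple idea: when a link saturates, traffic marked as higher class is forwarded ahead of best-effort traffic. The toy sketch below illustrates that strict-priority scheduling in the abstract; it is not any particular carrier’s QoS or MPLS implementation, and the packet names are invented.

```python
from collections import deque

# Toy sketch of strict-priority scheduling (not a real carrier's QoS/MPLS code):
# when a link is saturated, queued premium packets are forwarded before
# best-effort ones, so paying customers see less delay.
premium, best_effort = deque(), deque()

def enqueue(packet, is_premium):
    (premium if is_premium else best_effort).append(packet)

def transmit():
    """Forward the next packet, premium class first."""
    if premium:
        return premium.popleft()
    if best_effort:
        return best_effort.popleft()
    return None

enqueue("corporate VPN frame", True)
enqueue("web page request", False)
enqueue("video stream chunk", True)
print([transmit() for _ in range(3)])
# -> ['corporate VPN frame', 'video stream chunk', 'web page request']
```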
Andy Rice of Chicago’s Jordan Industries is assisting Zvolve Systems Inc., a software company based in Atlanta that has developed an intelligent network rerouting system, currently in beta testing with an unidentified major carrier.
Zvolve is trying to attract new resources for further development of its Conscious Network system, developed to specifically address congestion issues without huge capital outlays by carriers.
Yet Rice says the company’s efforts are running head-on into what he insists are misperceptions that a bandwidth glut makes such technology unnecessary at present.
“There isn’t a lot of information out there about the congestion problem and how serious it is, or how the myth of the bandwidth glut differs from this congestion issue,” he says. “It’s much more significant than people realize.”
The Zvolve system redirects traffic around bottlenecks and onto routes that are not congested, without human intervention. Studies show that the primary routes between the nation’s major cities, like air traffic routes, are the ones that quickly become congested, resulting in slowdowns and quality-of-service failures.
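In the abstract, automatic rerouting of this kind amounts to recomputing the cheapest path whenever measured link delays change. The sketch below is a generic illustration of that idea, not Zvolve’s actual Conscious Network code; the city topology and delay figures are hypothetical.

```python
import heapq

# Generic congestion-aware rerouting sketch (not Zvolve's implementation).
# Each link's cost is its current measured delay; when a primary route
# congests, a shortest-path recomputation steers traffic onto lighter links
# with no human intervention.
def best_path(links, src, dst):
    """Dijkstra over links = {node: [(neighbor, current_delay_ms), ...]}."""
    heap = [(0, src, [src])]
    seen = set()
    while heap:
        delay, node, path = heapq.heappop(heap)
        if node == dst:
            return delay, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, cost in links.get(node, []):
            if nxt not in seen:
                heapq.heappush(heap, (delay + cost, nxt, path + [nxt]))
    return None

# Hypothetical topology: the direct NYC-Chicago link is congested (high delay),
# so traffic is steered through Washington instead.
links = {
    "NYC": [("CHI", 120), ("DC", 10)],
    "DC":  [("CHI", 25)],
}
print(best_path(links, "NYC", "CHI"))   # -> (35, ['NYC', 'DC', 'CHI'])
```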
Rice argues that the premium services carriers sell to businesses often don’t guarantee that traffic will flow uninterrupted, and that they create a misleading impression.
“The high-priced traffic gets backed up and delayed just like the common traffic does when heavy congestion occurs,” Rice contends. “Then what happens is that carriers negotiate with the buyers of premium service to rebate the penalties for these incidents. Or they just don’t charge them for a few months.”
Supply and Occasional Demand
IT professionals say their bandwidth needs are affected by the type of traffic computer users demand.
For Rich Pacheco, an IT manager at the University of Massachusetts at Dartmouth, it’s very specific. He began monitoring the university’s bandwidth use last year with Elron Software’s Web Inspector and determined that students’ unrestricted access to bandwidth-consuming activities like file sharing and music downloads was creating enormous traffic volumes, particularly in the evenings when they returned to their dormitories. The analysis helped justify requests that U-Mass purchase more network capacity, Pacheco says.
“Latency was becoming a real issue,” he explains. “Things were timing out. E-mail was very slow. We could identify the spikes as occurring after 5 p.m. That jump-started us to think about the need for more bandwidth.”
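The kind of analysis Pacheco describes boils down to bucketing traffic by hour and looking for the evening spike. The sketch below is a generic illustration, not Elron Software’s Web Inspector; the log records and byte counts are invented for the example.

```python
from collections import Counter
from datetime import datetime

# Generic per-hour traffic tally from a hypothetical log of
# (timestamp, bytes_transferred) records. An after-5 p.m. spike becomes
# easy to spot and to cite when requesting more network capacity.
def bytes_by_hour(records):
    totals = Counter()
    for timestamp, nbytes in records:
        hour = datetime.fromisoformat(timestamp).hour
        totals[hour] += nbytes
    return totals

sample = [
    ("2001-09-20T14:10:00", 2_000_000),
    ("2001-09-20T17:05:00", 48_000_000),   # evening file-sharing surge
    ("2001-09-20T17:40:00", 51_000_000),
    ("2001-09-20T21:15:00", 64_000_000),
]
for hour, total in sorted(bytes_by_hour(sample).items()):
    print(f"{hour:02d}:00  {total / 1_000_000:6.1f} MB")
```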
Though there is little evidence that businesses are losing data because of network congestion, many do feel the impact: latency can cut into employee productivity and other efficiencies, and that translates into increased costs.
Retail businesses that depend on consumer traffic suffer as well, because Internet shoppers and other users are only so patient when pages fail to load or navigation around a retail Web site slows to a crawl.
As the nation saw in the hours after the terrorist attacks, other problems can result from overloads on wireless or wireline voice communications.