Emerging technology - Squeeze this...

IT and network managers who need to improve network performance -- meaning all of them -- can spread their money across three bets: more bandwidth, caching or multicasting systems, and better compression. At first glance, compression seems the smartest option: It is typically cheaper than the other choices, it usually takes effect immediately, it avoids installation and upgrade complications, and it leverages existing investments in bandwidth and caching technologies. Unfortunately, compression has long suffered from several complications that often make it a last choice -- but that may soon change.

Hurdles by the score.

Compression's problems are pretty basic. For instance, it saves the most transmission resources when it runs end to end, desktop to desktop. (There is no benefit from a compression-decompression cycle in the middle of a connection.) That requirement can demand a high level of cross-platform compatibility and often some end-user training. Even more frustrating, compression makes files opaque to many network management tools, since it alters the format of the data that the tools use to identify and understand the traffic.

Compression is also a resource vacuum. The most common compression technique identifies repeated chunks of data and builds lists of these chunks at each end of a connection. The compression software then simply points at a place on the list instead of sending blocks of data itself. These "substitutional" or "dictionary-based" compressors can work wonders but at a cost: They gobble processing power and memory. These resource constraints can bite especially deep for network applications, where the compression algorithm may have only a few milliseconds in which to do its work. As a result, compression users often cut corners -- such as limiting the number and complexity of the objects the system is able to handle -- reducing resource load but also restricting compression efficiency.
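To see how substitutional compression works, consider the minimal Python sketch below. It splits a message into fixed-size chunks, keeps an identical running list of chunks at both ends, and replaces any repeat with a small index into that list. The eight-byte chunk size, token format and sample payload are illustrative only, not any vendor's implementation.

```python
# Toy "substitutional" (dictionary-based) compressor: both ends keep a list
# of chunks already seen; repeats are sent as a small index instead of data.
# Chunk size and token encoding are illustrative only, not a vendor format.

CHUNK = 8  # bytes per chunk; real compressors use variable-length matches

def compress(data: bytes):
    dictionary = {}          # chunk -> index, built identically on both ends
    out = []                 # list of ("ref", index) or ("lit", chunk) tokens
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        if chunk in dictionary:
            out.append(("ref", dictionary[chunk]))   # point at the list
        else:
            dictionary[chunk] = len(dictionary)
            out.append(("lit", chunk))               # send the data itself
    return out

def decompress(tokens):
    dictionary = []          # rebuilt in the same order as the sender's
    out = bytearray()
    for kind, value in tokens:
        chunk = dictionary[value] if kind == "ref" else value
        if kind == "lit":
            dictionary.append(chunk)
        out.extend(chunk)
    return bytes(out)

if __name__ == "__main__":
    payload = b"GET /index.html" * 50            # highly repetitive traffic
    tokens = compress(payload)
    assert decompress(tokens) == payload
    refs = sum(1 for kind, _ in tokens if kind == "ref")
    print(f"{len(tokens)} tokens, {refs} sent as dictionary references")
```

Even this toy version must hold its whole dictionary in memory at both ends of the connection, which hints at why production compressors gobble processing power and RAM.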

Razing the barriers.

Fortunately, computers have begun to deliver enough horsepower to lift the ceiling on serious compression development. The growing density of corporate wide area networks has raised the cost of across-the-board bandwidth upgrades. And the globalisation of commerce has made compatibility with the low-bandwidth connections and per-bit pricing found on other continents more important. As a result, a host of new network-centred compression products and services are emerging from laboratories and vendors.

Developers have made considerable progress on compressors specialised for specific types of content, for example. The more assumptions a compressor can make about its material, the better it works. Medical Synergy Technologies of Rochester, New York, exploits this fact in a medical dictation system that lets radiologists define common medical conditions and treatments as "repeated objects." With the system, doctors simply point and click from a list of objects as they dictate, using speech only to specify how a certain patient differs from the predefined cases. The system then sends only the position of that case in an identical library at the other end of the connection, radically reducing time and bandwidth demands for both dictation and transcription.
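The underlying trick is easy to illustrate. In the hedged sketch below, both ends hold the same library of predefined cases, so the sender transmits only a case index plus the patient-specific free text; the library entries, function names and message format are invented for illustration and do not reflect Medical Synergy's actual product.

```python
# Both ends hold the same library of predefined cases; the sender transmits
# only the case's index plus the patient-specific differences. The entries
# and message format here are invented purely for illustration.

CASE_LIBRARY = [
    "Chest X-ray: no acute cardiopulmonary abnormality.",
    "Chest X-ray: right lower lobe consolidation consistent with pneumonia.",
]

def dictate(case_index: int, patient_delta: str):
    # The "compressed" dictation: a library position plus the free-text delta.
    return (case_index, patient_delta)

def transcribe(message):
    index, delta = message
    return f"{CASE_LIBRARY[index]} {delta}".strip()

if __name__ == "__main__":
    msg = dictate(1, "Follow-up recommended in six weeks.")
    print(transcribe(msg))   # full report reconstructed from a tiny message
```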

Smart compression arrives.

Some companies have gone one step further. They combine several specialist programs into a library with enough intelligence to know which specialist to pick and when. For example, such programs can recognise a spreadsheet file, decompose it into a more succinct set of formulas and data, and then pass that information down the line. The receiving end then recognises the information as spreadsheet data and can regenerate the columns and rows of cells in their original format. Different compressors can do the same for presentation and word processing files. Such library programs may also recognise a file that was compressed with a less-effective algorithm, call up the original compression program, decompress the file, recompress it with a more powerful algorithm and then send it on -- all in real time.
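A rough sketch of such a dispatcher appears below: it sniffs a file's leading bytes, decides which specialist to apply, and passes already-compressed material through for possible recompression. The detection rules are simplified, and the standard zlib module stands in for the proprietary content-specific compressors the vendors actually use.

```python
# Sketch of a compressor "library" that sniffs a file's type and dispatches
# to a specialist routine. Detection rules are deliberately crude, and zlib
# stands in for the content-specific compressors described in the article.
import zlib

def detect_type(data: bytes) -> str:
    if data[:2] == b"PK":                  # zip container (e.g. office files)
        return "already-compressed"
    if data[:5] == b"%PDF-":
        return "pdf"
    if data.lstrip()[:1] in (b"{", b"<"):
        return "structured-text"
    return "generic"

def smart_compress(data: bytes):
    kind = detect_type(data)
    if kind == "already-compressed":
        # A real product might decompress and recompress with a stronger
        # algorithm; here we simply pass the file through untouched.
        return kind, data
    # Specialist compressors would go here; zlib is a generic stand-in.
    return kind, zlib.compress(data, 9)

if __name__ == "__main__":
    sample = b"<spreadsheet><cell>=A1+B1</cell></spreadsheet>" * 20
    kind, packed = smart_compress(sample)
    print(kind, len(sample), "->", len(packed), "bytes")
```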

Intelligent Compression Technologies (ICT) makes a suite of such products. Loretta Michaels, director of Nortel Networks' Wireless Internet Division, says that ICT's compressors can make a 9600bps or 14.4Kbps connection behave like a 56Kbps connection or better. "A file containing a PowerPoint presentation that would take an hour to download without intelligent compression could be received in four minutes," she says. According to Michaels, the Toronto-based Nortel was so impressed by ICT's compressors, which it originally bought for internal use, that the company is incorporating them into products of its own.

ICT isn't the only advanced compression company out there. Expand Networks of Roseland, New Jersey, builds network accelerators that save bandwidth through several techniques. For example, the accelerators can recognise the data packets in a single communications session, strip out their headers (which, by definition, are all the same), and replace them with a two-bit session number. Feargal Ledwidge, network manager for California-based Wyle Electronics, says he's seen a threefold to fourfold increase in effective bandwidth from using the products. "Without these accelerators we would probably have had to double our bandwidth purchases," he says. Tomer Zaidel, technical manager for Internet Gold -- one of Israel's largest ISPs and another Expand customer -- points out that compression is especially valuable when it replaces relatively costly international bandwidth. For his company, he says, the Expand products produced a return on investment in a matter of weeks.
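The header-replacement technique can be sketched in a few lines: within one session the full header travels once, and later packets carry only a short session identifier plus their payloads. The header layout, token format and identifier width below are simplified for illustration rather than modelled on Expand's wire format.

```python
# Sketch of header replacement inside a single session: the full header is
# sent once, and subsequent packets carry only a short session ID plus the
# payload. Header layout and ID width are simplified for illustration.

def compress_stream(packets):
    """packets: iterable of (header, payload) pairs; yields wire tokens."""
    known = {}                         # header -> session id
    for header, payload in packets:
        if header not in known:
            known[header] = len(known)
            yield ("full", known[header], header, payload)
        else:
            yield ("short", known[header], payload)

def decompress_stream(tokens):
    headers = {}                       # session id -> header, rebuilt in step
    for token in tokens:
        if token[0] == "full":
            _, sid, header, payload = token
            headers[sid] = header
        else:
            _, sid, payload = token
            header = headers[sid]
        yield header + payload

if __name__ == "__main__":
    hdr = b"\x45\x00SRC->DST:80"       # stand-in for a repeated packet header
    pkts = [(hdr, f"chunk {i}".encode()) for i in range(5)]
    restored = list(decompress_stream(compress_stream(pkts)))
    assert restored == [h + p for h, p in pkts]
    print("round trip OK; header sent once instead of", len(pkts), "times")
```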

Close enough for jazz.

While a spreadsheet or network traffic needs to be identical -- bit-for-bit -- on both ends of a compression chain, other data types aren't nearly so finicky. Speech, music and video can tolerate an error or two without serious impact on the receiving end. Knowing that, some developers have created techniques that send just the essential elements of the communication, at the cost of introducing (hopefully) unimportant differences between what the sender transmits and the recipient receives. These so-called lossy algorithms have a significant advantage: They can be adjusted to send more or less exact versions of a file depending on the users' needs and network conditions. Workfire.com of Scottsdale, Arizona, for instance, makes a product called Workfire Server that can store several compressed variants of a given file, detect the connection speed of an end user and then pick the appropriately sized variation -- faster connections get bigger files, slower connections get smaller ones. And Lucent Technologies has successfully demonstrated (but as of this writing not yet marketed) a system that senses network conditions and adjusts compression in telephony applications accordingly.
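Workfire's variant-selection step might look something like the sketch below, which maps a measured connection speed to one of several pre-stored lossy variants. The speed thresholds, quality labels and file sizes are invented for illustration.

```python
# Sketch of serving pre-compressed lossy variants by connection speed, in the
# spirit of the approach described above. Thresholds, labels and sizes are
# invented for illustration, not taken from any product.

VARIANTS = [
    # (minimum speed in Kbps, label, approximate size in KB)
    (128, "high-quality JPEG", 240),
    (56,  "medium JPEG",       80),
    (0,   "low-quality JPEG",  25),
]

def pick_variant(measured_kbps: float):
    for min_kbps, label, size_kb in VARIANTS:
        if measured_kbps >= min_kbps:
            return label, size_kb
    return VARIANTS[-1][1], VARIANTS[-1][2]

if __name__ == "__main__":
    for speed in (14.4, 56, 384):
        label, size_kb = pick_variant(speed)
        print(f"{speed:>6} Kbps -> {label} ({size_kb} KB)")
```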

Even the management penalties imposed by compression are gradually dissolving. NetScout Systems recently released a Decompression Probe that solves the problem of indecipherably compressed network traffic. The product takes a slice out of the flow of compressed data, identifies the program that compressed it, decompresses the slice and hands the necessary information over to the proper monitoring and analysis programs. John W. Parsons, manager of Global Telecom Planning and Design at Eastman Kodak, says his company successfully uses the NetScout probes to analyse network traffic for applications such as SAP and Lotus Notes.
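Conceptually, the probe's first job is format identification, which the sketch below illustrates by recognising a captured slice from its leading bytes and decompressing it for analysis. Only gzip and raw zlib are recognised here; a real probe understands far more formats, and this is not NetScout's implementation.

```python
# Sketch of the probe idea: take a slice of compressed traffic, identify the
# format from its leading bytes, and decompress it for analysis. Only gzip
# and raw zlib are recognised in this toy version.
import gzip
import zlib

def identify(slice_: bytes) -> str:
    if slice_[:2] == b"\x1f\x8b":
        return "gzip"
    if slice_[:1] == b"\x78":          # common zlib header byte
        return "zlib"
    return "unknown"

def peek(slice_: bytes) -> bytes:
    fmt = identify(slice_)
    if fmt == "gzip":
        return gzip.decompress(slice_)
    if fmt == "zlib":
        return zlib.decompress(slice_)
    raise ValueError("unrecognised compression format")

if __name__ == "__main__":
    traffic = gzip.compress(b"SAP RFC call: BAPI_SALESORDER_GETLIST")
    print(identify(traffic), peek(traffic)[:20])
```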

Measuring success.

While all these tools seem to offer significant benefits, the increasing complexity of the compressor market means that IT managers will need to create their own standards for success and return on investment. And even some vendors admit that there are no guarantees. "Nobody's claims should be accepted at face value," stresses William Sebastian, president of ICT.

But while many people think of compression as a temporary fix -- useful only until we can enjoy the ocean of bandwidth promised by analysts such as George Gilder -- history suggests that human appetites will grow even faster than bandwidth technology can satisfy them. HDTV can already gobble up 6Mbps. In a few years we may find ourselves with even more data-hungry wall-sized displays, panoramic video and 3-D images. The ultimate video technology, holography, could suck up terabits of bandwidth and still ask for more. Given that, unless human appetites change remarkably, IT managers will be buying better compression for many years to come.
