The solution, according to Iyer, is network memory: algorithms that combine load balancing and caching on commodity memory. Load balancing distributes the load across slower memories and guarantees that memory is available when data needs to be accessed; caching guarantees that the data is available in cache memory 100% of the time.
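The load-balancing half of that idea can be sketched in a few lines. This is a simplified illustration, not Cisco's implementation: writes are interleaved round-robin across several slow memory banks, so each bank sees only a fraction of the traffic and a bank running well below line rate can still keep up, while reads drain the banks in the same order to preserve first-in, first-out behavior.

```python
from collections import deque

class InterleavedMemory:
    """Round-robin load balancing across several slow memory banks.

    Illustrative sketch only; bank count and policy are assumptions,
    not the algorithms Iyer describes.
    """
    def __init__(self, num_banks):
        self.banks = [deque() for _ in range(num_banks)]
        self.write_bank = 0   # next bank to receive a write
        self.read_bank = 0    # next bank to serve a read

    def write(self, item):
        # Distribute load: each bank sees only 1/num_banks of the
        # traffic, so slower memory can absorb a fast input stream.
        self.banks[self.write_bank].append(item)
        self.write_bank = (self.write_bank + 1) % len(self.banks)

    def read(self):
        # Drain banks in the same round-robin order as the writes, so
        # the combined stream comes back first-in, first-out.
        item = self.banks[self.read_bank].popleft()
        self.read_bank = (self.read_bank + 1) % len(self.banks)
        return item
```

The interleaving is what buys the "hard guarantee": as long as reads and writes alternate across banks deterministically, no single bank is ever asked to respond faster than its own access time allows.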
"They provide hard guarantees, mathematical guarantees that performance would never fail," Iyer says.
Applications for network memory include buffering, NetFlow accounting and quality-of-service (QoS).
For buffering, a router must make sure that the packets it needs are always in the cache. With a small SRAM cache inside a packet-processing ASIC and slow commodity DRAM, it is possible to build a huge, fast, low-power packet buffer using network memory algorithms, Iyer says.
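The SRAM-plus-DRAM arrangement can be sketched as follows. In this hedged simulation (sizes and refill policy are invented for illustration, not Cisco's design), the head of a queue lives in a small fast cache so every dequeue hits SRAM, while the bulk of the queue sits in slow DRAM and is moved into the cache in batches, keeping slow-memory accesses infrequent:

```python
from collections import deque

SRAM_HEAD = 4   # packets of the queue kept in fast SRAM (assumed size)

class HybridQueue:
    """FIFO packet buffer: small fast head cache over slow bulk storage.

    Illustrative sketch; real designs size the cache so dequeues are
    guaranteed to hit SRAM at line rate.
    """
    def __init__(self):
        self.sram = deque()   # fast head cache: dequeues served here
        self.dram = deque()   # slow bulk storage for the rest

    def enqueue(self, pkt):
        # Fast path while the head cache has room and DRAM is empty;
        # otherwise overflow into slow bulk memory.
        if not self.dram and len(self.sram) < SRAM_HEAD:
            self.sram.append(pkt)
        else:
            self.dram.append(pkt)

    def dequeue(self):
        pkt = self.sram.popleft()   # always served from fast SRAM
        # Prefetch: when the head cache runs low, refill it from DRAM
        # in a batch, so the slow memory is touched only occasionally.
        while self.dram and len(self.sram) < SRAM_HEAD:
            self.sram.append(self.dram.popleft())
        return pkt
```

Because packets in SRAM are always older than packets in DRAM, the refill preserves FIFO order while the expensive DRAM is accessed in bursts rather than per packet.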
For QoS, network memory enables routers to better provide strict performance guarantees for critical applications, such as remote surgery and supercomputing. Network memory also helps maintain state for applications such as NetFlow, which collects IP traffic information for monitoring purposes.
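The state-maintenance case follows the same caching pattern. As a hedged sketch (the flow identifiers, packet sizes, and flush policy here are invented for illustration and are not NetFlow's actual format), per-packet counter updates can land in a small fast store and be folded into large slow-memory counters only occasionally:

```python
class CounterCache:
    """Per-flow byte counters: small fast increments, slow bulk totals.

    Illustrative sketch of caching counter updates; the threshold is
    an assumption, not a real NetFlow parameter.
    """
    def __init__(self, flush_threshold=12000):
        self.fast = {}   # recent increments in fast memory, per flow
        self.slow = {}   # full counters in slow memory
        self.flush_threshold = flush_threshold

    def count_packet(self, flow_id, nbytes):
        # Per-packet work touches only the fast memory.
        self.fast[flow_id] = self.fast.get(flow_id, 0) + nbytes
        if self.fast[flow_id] >= self.flush_threshold:
            self.flush(flow_id)

    def flush(self, flow_id):
        # Infrequent writes fold accumulated increments into slow memory.
        self.slow[flow_id] = (self.slow.get(flow_id, 0)
                              + self.fast.pop(flow_id))

    def total(self, flow_id):
        # A correct reading combines both levels of the hierarchy.
        return self.slow.get(flow_id, 0) + self.fast.get(flow_id, 0)
```

The slow memory then only needs to sustain one write per flush threshold's worth of traffic, rather than one write per packet.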
Network memory techniques are currently being designed into Cisco's next-generation 10Gbps and 40Gbps Ethernet switches and enterprise routers, Iyer says. They are also being designed into the company's next-generation 100Gbps router line cards, he says.
In addition to making memory more efficient and enhancing router performance on high-speed links, the algorithms reduce board real estate by cutting pin counts on packet-processing ASICs and reducing the number of on-board ASICs, Iyer says. That, in turn, approximately halves the physical area needed to build a router line card, leading to a considerable reduction in power requirements for high-speed systems, he says.
Still, network memory technology will need to continue to evolve as routers become faster and more complex, and memory requirements attempt to keep pace with new applications designed to harness higher and higher speeds, Iyer notes. This will require continued improvement and innovation in caching and load balancing technology, he says.