March 31, 2011
The current Internet consists of tens of thousands of interconnected autonomous networks. It was designed to support large numbers of point-to-point content transfers. This introduces considerable headaches due to imbalances in traffic flows, which complicate network management, reduce the robustness of the Internet, and degrade user performance.
In this talk, I review the sources of traffic in the Internet over the last 20 years. We will observe that an increasing fraction of traffic was generated by peer-to-peer (P2P) "swarm" technology from 2000 to 2007. I examine how and why P2P swarm technology simplifies network management and makes the Internet and its applications more robust. Unfortunately, the use of this technology has decreased since 2006 and shows every sign of continuing to do so. Thus, the rest of the talk describes and characterizes an architecture that encourages swarms and makes them first-class citizens. I will focus not only on the problem of transferring content but also on the problem of locating content.
Don Towsley (University of Massachusetts)