
Impact of Varying Simulation Parameters

We next perform a sensitivity analysis to study the effect of varying the topology parameters. Figure 8 presents the various metrics as the bottleneck bandwidth is varied. The flow arrival rate is set such that the offered load to the system is 60% of the bottleneck capacity in every run of the experiment. At lower capacities, TCP's slow-start phase overruns the available bandwidth, causing bursts of packet loss that force substantial back-off and increase transfer times. TCP's transfer-time performance levels out with increasing bandwidth, but never approaches the performance of PCP due to the $O(\log n)$ overhead associated with the startup phase.
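Fixing the offered load ties the flow arrival rate directly to the bottleneck capacity and the mean flow size. The sketch below is a minimal illustration of that bookkeeping, not code from the simulation; the function name, the swept bandwidth values, and the exact byte accounting (250 KB = 250 x 1024 bytes) are our assumptions.

```python
def arrival_rate(capacity_bps, flow_size_bytes, load=0.6):
    """Flows per second such that arrivals * flow size = load * capacity."""
    return load * capacity_bps / (flow_size_bytes * 8)

# Sweep the bottleneck bandwidth, as in Figure 8, holding the load at 60%
# and the flow size at 250 KB.
for mbps in (10, 20, 40, 80, 160):
    rate = arrival_rate(mbps * 1e6, 250 * 1024)
    print(f"{mbps:4d} Mb/s -> {rate:5.1f} flows/s")
```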

Figure 9 illustrates the performance of flows through our base configuration of a 40 Mb/s bottleneck router as we vary their round-trip latency. We again consider fixed-size flows of length 250 KB, and we fix the offered load at 60% (twelve new flows per second for this configuration). The average round-trip latency is varied from 5 ms to 100 ms, and the buffer space is set to the corresponding bandwidth-delay product for each run. At small RTTs, TCP flows quickly overflow the small router queues, while at high RTTs, the $O(\log n)$ slow-start overhead translates into much higher transfer times. PCP flows track the performance of fair queueing under all RTT conditions.
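The quoted arrival rate is consistent with the load formula above: 0.6 x 40 Mb/s divided by 250 KB flows gives roughly twelve flows per second. The following sketch shows the corresponding buffer-sizing rule, one bandwidth-delay product per RTT setting; the 1500-byte packet size used to express the buffer in packets is an assumption on our part, not stated in the text.

```python
def bdp_packets(capacity_bps, rtt_s, pkt_bytes=1500):
    """Bandwidth-delay product expressed in (assumed) 1500-byte packets."""
    return int(capacity_bps * rtt_s / (pkt_bytes * 8))

# Sweep the RTT, as in Figure 9, at the 40 Mb/s base configuration.
for rtt_ms in (5, 25, 50, 100):
    print(f"RTT {rtt_ms:3d} ms -> buffer of {bdp_packets(40e6, rtt_ms / 1000):4d} packets")
```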

We also study performance as we vary the mean flow length. Figure 10 graphs the performance metrics as we vary the flow size and correspondingly adjust the arrival rate to keep the offered load fixed at 60%. For TCP flows, we observe a tradeoff between two competing effects. As flow lengths increase, the initial slow-start overhead is amortized over a larger transfer; this gain is offset, however, by higher loss rates, since longer flows contain enough packets for TCP to overrun the buffer during the slow-start phase.
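One rough way to see the amortization is to treat the slow-start phase as lasting on the order of $\log_2$ of the flow's packet count in RTTs, so its share of the transfer shrinks as flows grow. The sketch below illustrates only that scaling; interpreting $n$ as the packet count and assuming 1500-byte packets are both our assumptions, not claims from the paper.

```python
import math

def slow_start_rtts(flow_bytes, pkt_bytes=1500):
    """Approximate slow-start duration in RTTs: ~log2 of the packet count."""
    packets = max(flow_bytes / pkt_bytes, 1)
    return math.log2(packets)

# Longer flows spend proportionally less of their transfer in slow start.
for kb in (50, 250, 1000, 4000):
    flow = kb * 1024
    print(f"{kb:5d} KB -> ~{flow / 1500:6.0f} packets, "
          f"~{slow_start_rtts(flow):4.1f} slow-start RTTs")
```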

Figure: Effect of varying bottleneck bandwidth. Average RTT is 25 ms, flow lengths are 250 KB, and interarrival times are set to operate the system at 60% load. Fair queueing and PCP suffer no packet loss.
[Four panels: transfer time, transfer-time deviation, queue size, and packet loss, versus bottleneck bandwidth.]

Figure: Effect of varying round-trip time. Bottleneck bandwidth is 40 Mb/s, flow lengths are 250 KB, and interarrival times are set to operate the system at 60% load. Fair queueing and PCP suffer no packet loss.
[Four panels: transfer time, transfer-time deviation, queue size, and packet loss, versus round-trip time.]

Figure: Effect of varying flow size. Bottleneck bandwidth is 40 Mb/s, average RTT is 25 ms, and interarrival times are set to operate the system at 60% load. Fair queueing and PCP suffer no packet loss.
[Four panels: transfer time, transfer-time deviation, queue size, and packet loss, versus flow size.]

