Next: What to Prefetch
Up: Introduction
Previous: Where is Prefetching Applied
The goal of prefetching is to make data available in the cache before the data
consumer places its request, thereby masking the latency of the slower data
source below the cache.
However, prefetching is not without cost. It requires (i) cache space to keep the
prefetched data; (ii) network bandwidth to transfer the data to the cache;
(iii) data source bandwidth to read the data; and (iv) processing
power to carry out the prefetch.
If the prefetched data is not subsequently used by the data consumer,
the extra cost of prefetching normally reduces performance.
Only in over-provisioned systems can prefetching with low predictive accuracy
improve performance. The data cache, however, is by nature under-provisioned: it
can hold only a subset of the data set. The prefetched data typically shares the
cache space with demand-paged data. Therefore, the utility of the prefetched data should not
be lower than the utility of the demand-paged data it replaces. To maximize the
performance, the marginal utility of both kinds of data should be
equalized [17]. Since the utility of prefetched data that is not
subsequently used is zero, it is extremely important to prefetch judiciously,
keeping the number of wasted prefetches to a minimum.
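This trade-off can be sketched as a simple expected-value comparison. The threshold model below is illustrative only and not taken from the text; all parameters (hit probability, miss latency, reuse probability of the evicted block, fixed prefetch overhead) are hypothetical:

```python
def should_prefetch(hit_probability, miss_latency,
                    evicted_reuse_probability, prefetch_overhead):
    """Decide whether a single prefetch is worthwhile.

    Expected benefit: the miss latency saved if the prediction is correct.
    Expected cost: a future miss on the demand-paged block the prefetch
    evicts, plus the fixed bandwidth/CPU overhead of the prefetch itself.
    All parameters are hypothetical, for illustration only.
    """
    expected_benefit = hit_probability * miss_latency
    expected_cost = evicted_reuse_probability * miss_latency + prefetch_overhead
    return expected_benefit > expected_cost

# A wasted prefetch (hit_probability = 0) has zero utility and never pays off:
assert not should_prefetch(0.0, 100.0, 0.1, 5.0)
# An accurate prefetch easily outweighs a rarely reused victim block:
assert should_prefetch(0.9, 100.0, 0.1, 5.0)
```

The comparison mirrors the marginal-utility argument above: a prefetch is justified only when its expected utility is at least that of the demand-paged data it displaces.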
Furthermore, any prefetching algorithm needs to be able to predict
accesses sufficiently in advance to allow for the time it takes to prefetch the
data.
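The required lookahead, often called the prefetch distance, follows directly from the fetch latency and the consumer's request rate. A minimal sketch, with hypothetical timing values:

```python
import math

def prefetch_distance(fetch_latency_us, inter_request_us):
    """Minimum number of requests in advance an access must be predicted
    so that the prefetched data arrives before it is demanded.
    Timing values are hypothetical, for illustration only."""
    return math.ceil(fetch_latency_us / inter_request_us)

# If a fetch takes 5000 us and the consumer issues a request every 200 us,
# accesses must be predicted at least 25 requests ahead:
assert prefetch_distance(5000, 200) == 25
```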
As a rule of thumb, prefetching is useful when the long-term prediction accuracy
of access patterns is high.