Introduction
Over the last several decades, we have witnessed remarkable improvements in the information
processing capabilities of computing systems.
A large number of data storage technologies
have also been developed, with diverse speed, capacity, reliability, and affordability characteristics.
We often find that cost considerations force us to design systems with a data storage
component that runs significantly slower than the processing unit. To bridge this gap between the
data supplier and the data consumer, faster data caches are placed between the two. Since caches
are expensive, they can typically hold only a subset of the entire data set. Consequently, it is extremely important
to manage the cache wisely to maximize its performance. The cornerstone of read cache management
is to keep recently requested data in the cache in the hope that
such data will be requested again in the near future. Data is placed in the cache only when
requested by the consumer (demand-paging).
A second, competing method is to fetch into the cache data that is predicted
to be requested in the near future (prefetching).
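To make the distinction concrete, the following minimal sketch contrasts a purely demand-paged cache with one that also prefetches a few sequential blocks on each miss. It is our own illustration under simplifying assumptions (a block-addressed LRU cache, a fixed prefetch degree); the ReadCache class and prefetch_degree parameter are hypothetical and not part of AMP.

from collections import OrderedDict

class ReadCache:
    """A fixed-size LRU cache over block numbers. With prefetch_degree=0
    it is purely demand-paged; otherwise each miss also fetches the next
    prefetch_degree sequential blocks speculatively."""

    def __init__(self, capacity, prefetch_degree=0):
        self.capacity = capacity
        self.prefetch_degree = prefetch_degree
        self.cache = OrderedDict()          # block number -> data
        self.hits = self.misses = 0

    def _fetch_from_storage(self, block):
        return f"data-{block}"              # stand-in for a slow storage read

    def _install(self, block):
        # Insert a block, evicting the least recently used one if full.
        self.cache[block] = self._fetch_from_storage(block)
        self.cache.move_to_end(block)
        while len(self.cache) > self.capacity:
            self.cache.popitem(last=False)

    def read(self, block):
        if block in self.cache:             # hit: served from the cache
            self.hits += 1
            self.cache.move_to_end(block)
        else:                               # miss: demand-page the block...
            self.misses += 1
            self._install(block)
            # ...and, if prefetching, the next few sequential blocks too
            for b in range(block + 1, block + 1 + self.prefetch_degree):
                if b not in self.cache:
                    self._install(b)
        return self.cache[block]

# Usage: a sequential scan of 100 blocks through a 16-block cache.
demand = ReadCache(capacity=16)
prefetch = ReadCache(capacity=16, prefetch_degree=4)
for blk in range(100):
    demand.read(blk)
    prefetch.read(blk)
print(demand.hits, prefetch.hits)           # 0 hits vs 80 hits

On this purely sequential scan the demand-paged cache misses on every block, while the prefetching cache misses only once per prefetch_degree + 1 blocks. Real workloads, of course, interleave sequential and random streams, which is what makes managing prefetching difficult.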