Big Wins with a Small Application-Aware Cache
Julio López
David O'Hallaron
Tiankai Tu
Carnegie Mellon University, Pittsburgh, PA


Abstract

Large datasets, on the order of gigabytes and terabytes, are increasingly common as abundant computational resources allow practitioners to collect, produce, and store data at higher rates. As dataset sizes grow, it becomes more challenging to interactively manipulate and analyze these datasets due to the large amounts of data that must be moved and processed. Application-independent caches, such as operating system page caches and database buffer caches, are present throughout the memory hierarchy to reduce data access times and alleviate transfer overheads. We claim that an application-aware cache with relatively modest memory requirements can effectively exploit dataset structure and application information to speed up access to large datasets. We demonstrate this idea with a system named the tree cache, which reduces query latency for large octree datasets by an order of magnitude.
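To make the idea concrete, the sketch below shows one way an application-aware cache over an octree-structured dataset could look: a small LRU cache keyed on (level, locational code) pairs that, on a miss, also caches the octant's ancestors, since point and range queries tend to walk the root-to-leaf search path. The names (TreeCacheSketch, fetch_octant, capacity) and the ancestor-caching policy are illustrative assumptions, not the paper's actual tree cache implementation or API.

    from collections import OrderedDict

    class TreeCacheSketch:
        """Minimal LRU cache keyed on octree locational codes (a sketch,
        not the tree cache described in the paper).

        On a miss, the requested octant is fetched from the backing store
        and, as a structure-aware policy, its ancestors along the
        root-to-leaf path are cached as well.
        """

        def __init__(self, fetch_octant, capacity=4096):
            self._fetch = fetch_octant        # callable: (level, code) -> octant data
            self._capacity = capacity
            self._entries = OrderedDict()     # (level, code) -> data, kept in LRU order

        def get(self, level, code):
            key = (level, code)
            if key in self._entries:
                self._entries.move_to_end(key)   # mark as most recently used
                return self._entries[key]
            value = self._fetch(level, code)
            self._insert(key, value)
            # Structure-aware step: also cache ancestors on the search path.
            # Each octree level contributes 3 bits to the locational code,
            # so the parent code is obtained by dropping the low 3 bits.
            for lvl in range(level - 1, -1, -1):
                code >>= 3
                akey = (lvl, code)
                if akey not in self._entries:
                    self._insert(akey, self._fetch(lvl, code))
            return value

        def _insert(self, key, value):
            self._entries[key] = value
            self._entries.move_to_end(key)
            if len(self._entries) > self._capacity:
                self._entries.popitem(last=False)   # evict least recently used

In this sketch, the application-specific knowledge lies in the key scheme and the ancestor-caching step; an application-independent page or buffer cache would see only opaque blocks and could not exploit that structure.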

BibTeX entry

@inproceedings{tree-cache-sc2004,
  author       = "Julio Lopez and David O'Hallaron and Tiankai Tu",
  title        = "Big Wins with a Small Application-Aware Cache",
  booktitle    = "Proceedings of {IEEE/ACM} Supercomputing 2004 ({SC2004})",
  organization = "{IEEE}",
  year         = "2004",
  address      = "Pittsburgh, PA, USA",
  month        = "Nov"
}