Type

Text | Dissertation

Advisor

Bender, Michael A. | Johnson, Rob | Gao, Jie | Mitchell, Joseph S.B. | Fineman, Jeremy.

Date

2015-12-01

Keywords

Cache-Adaptive Algorithms, Cache-Oblivious Algorithms, External Memory Algorithms, Memory-Adaptive Algorithms, Page Replacement Policies | Computer science

Department

Department of Computer Science.

Language

en_US

Source

This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of the degree.

Identifier

http://hdl.handle.net/11401/77278

Publisher

The Graduate School, Stony Brook University: Stony Brook, NY.

Format

application/pdf | 106 pages

Abstract

Memory efficiency and locality have a substantial impact on the performance of programs, particularly when operating on large data sets. Thus, memory- or I/O-efficient algorithms have received significant attention both in theory and in practice. The widespread deployment of multicore machines, however, brings new challenges. Specifically, since memory is shared across multiple processes, the effective memory size allocated to each process fluctuates over time. This dissertation studies algorithms in the context of a memory allocation that changes over time, which we call the cache-adaptive setting. The cache-adaptive model applies to operating systems, databases, and other systems where the allocation of memory to processes changes over time. Analytic techniques are provided for studying the behavior of recursive cache-oblivious algorithms in the cache-adaptive model. These techniques make analyzing algorithms in the cache-adaptive model almost as easy as in the Disk Access Model (DAM). They are applied to analyze a wide variety of algorithms: Master-Method-style algorithms, Akra-Bazzi-style algorithms, collections of mutually recursive algorithms, and algorithms, such as the cache-oblivious FFT algorithm, that break problems of size $N$ into subproblems of size $\Theta(N^c)$. While the cache-oblivious sorting algorithm Lazy Funnel Sort does not have any of these recursive structures, it is nonetheless proved to be optimally cache-adaptive. Paging algorithms are also studied in the cache-adaptive model. It is proved that the Least Recently Used (LRU) policy with resource augmentation is competitive with the optimal offline strategy. Moreover, Belady's Longest Forward Distance (LFD) policy is shown to remain optimal even when the memory size changes. Because cache-oblivious algorithms are well understood, frequently easy to design, and widely deployed, there is hope that provably good cache-adaptive algorithms can be deployed in practice.
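As an illustration of the cache-adaptive setting described in the abstract, the following Python sketch (invented here, not taken from the dissertation; the request trace, capacity schedule, and function name are assumptions) simulates LRU paging under a memory allocation that fluctuates over time and counts the resulting page faults.

# Illustrative sketch only, not code from the dissertation: simulate LRU paging
# when the cache capacity changes over time, as in the cache-adaptive setting.
# The request trace, capacity schedule, and function name below are invented.

from collections import OrderedDict

def lru_faults(requests, capacities):
    """Count LRU page faults when capacities[t] pages fit in cache at time t.

    Capacities are assumed to be positive integers.
    """
    cache = OrderedDict()   # resident pages, least recently used first
    faults = 0
    for page, m in zip(requests, capacities):
        # If the memory allocation shrank, evict least recently used pages.
        while len(cache) > m:
            cache.popitem(last=False)
        if page in cache:
            cache.move_to_end(page)            # hit: refresh recency
        else:
            faults += 1                        # miss: page must be fetched
            if len(cache) >= m:
                cache.popitem(last=False)      # make room by evicting LRU page
            cache[page] = True
    return faults

# A short trace under a memory allocation that shrinks and then grows again.
trace  = [1, 2, 3, 1, 2, 4, 1, 2, 3, 4]
memory = [3, 3, 3, 2, 2, 2, 4, 4, 4, 4]
print(lru_faults(trace, memory))   # prints the fault count (8 for this trace)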
