
Comprehensive Review of Data Prefetching Mechanisms


Affiliations
University College of Engineering, Punjabi University, Patiala, India
 

The expanding gap between microprocessor and DRAM performance has necessitated techniques that reduce or hide the latency of main memory accesses. Although large cache hierarchies have proven effective at reducing this latency for the most frequently used data, they remain inefficient for many access patterns.

This paper surveys data prefetching, a technique for hiding the access latency of data referencing patterns that defeat caching strategies. Rather than waiting for a cache miss to initiate a memory fetch, data prefetching anticipates such misses and issues a fetch to the memory system in advance of the actual memory reference. With data prefetching, the memory system brings data into the cache before the processor needs it, thereby reducing memory access latency. The survey examines several alternative approaches and discusses the design tradeoffs involved in implementing a data prefetching strategy.


Keywords

Prefetching, Memory Latency.





Authors

Sneha Chhabra
University College of Engineering, Punjabi University, Patiala, India
Raman Maini
University College of Engineering, Punjabi University, Patiala, India

