Tuesday, 30 September 2014

Cache Memory






In computing, a cache (/ˈkæʃ/ KASH) is a component that transparently stores data so that future requests for that data can be served faster. The data stored within a cache might be values that were computed earlier, or duplicates of original values stored elsewhere. If requested data is contained in the cache (a cache hit), the request can be served by simply reading the cache, which is comparatively fast. Otherwise (a cache miss), the data has to be recomputed or fetched from its original storage location, which is comparatively slow. Hence, the greater the number of requests that can be served from the cache, the faster the overall system performance becomes.
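The hit/miss logic described above can be sketched in a few lines of Python. This is a minimal illustration, not a real cache implementation; `expensive_compute()` is a hypothetical stand-in for the slower original source.

```python
def expensive_compute(key):
    # Placeholder for a slow recomputation or a fetch from original storage.
    return key * key

cache = {}

def get(key):
    if key in cache:                    # cache hit: serve directly from the cache
        return cache[key]
    value = expensive_compute(key)      # cache miss: go to the slower source
    cache[key] = value                  # store it so future requests are faster
    return value
```

The first call to `get(3)` is a miss and computes the value; every later call for the same key is a hit served straight from the dictionary.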


A cache is a special high-speed storage mechanism. It can be either a reserved section of main memory or an independent high-speed storage device. Two types of caching are commonly used in personal computers: memory caching and disk caching.


A memory cache, sometimes called a cache store or RAM cache, is a portion of memory made of high-speed static RAM (SRAM) instead of the slower and cheaper dynamic RAM (DRAM) used for main memory. Memory caching is effective because most programs access the same data or instructions over and over. By keeping as much of this information as possible in SRAM, the computer avoids accessing the slower DRAM.
Some memory caches are built into the architecture of microprocessors. The Intel 80486 microprocessor, for example, contains an 8K memory cache, and the Pentium has a 16K cache. Such internal caches are often called Level 1 (L1) caches. Most modern PCs also come with external cache memory, called Level 2 (L2) caches. These caches sit between the CPU and the DRAM. Like L1 caches, L2 caches are composed of SRAM, but they are much larger.
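The reason memory caching works so well, as noted above, is that programs tend to reuse the same data repeatedly (temporal locality). At the software level, Python's standard `functools.lru_cache` exploits exactly the same idea: results computed earlier are kept in fast memory and reused. This is an analogy to hardware caching, not a model of it.

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n):
    # Naive recursion recomputes the same subproblems over and over,
    # so caching earlier results makes repeated lookups nearly free.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(30)
stats = fib.cache_info()   # reports how many calls were hits vs. misses
```

Without the cache, `fib(30)` makes over a million recursive calls; with it, each value from 0 to 30 is computed exactly once and all other calls are cache hits.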


Disk caching works under the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory. The most recently accessed data from the disk (as well as adjacent sectors) is stored in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see if the data is there. Disk caching can dramatically improve the performance of applications, because accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a hard disk.
When data is found in the cache, it is called a cache hit, and the effectiveness of a cache is judged by its hit rate. Many cache systems use a technique known as smart caching, in which the system recognizes certain types of frequently used data. The strategies for determining which information should be kept in the cache constitute some of the more interesting problems in computer science.
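Both ideas above, tracking the hit rate and deciding which data to keep, can be sketched together. The example below uses a least-recently-used (LRU) eviction policy, one common answer to the "what should stay in the cache" question; the `capacity` limit and the `load()` callback standing in for a slow disk read are assumptions for illustration.

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # keeps keys in recency order
        self.hits = 0
        self.misses = 0

    def get(self, key, load):
        if key in self.data:
            self.hits += 1
            self.data.move_to_end(key)      # mark as most recently used
            return self.data[key]
        self.misses += 1
        value = load(key)                   # slow path: fetch from "disk"
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used item
        return value

    def hit_rate(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0
```

A higher hit rate means more requests avoided the slow path, which is exactly how the effectiveness of a disk cache is judged.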


A browser cache (called Temporary Internet Files in Internet Explorer) is used to improve how quickly pages load while browsing the Internet. In most cases, each time you open a web page, the page and all its files are sent to the browser's temporary cache on the hard drive. If that page, or a file contained on that page (e.g. a picture), needs to load again and has not been modified, the browser opens it from your cache instead of downloading it again. Caching saves you a lot of time, especially over a slow connection such as a modem, and can also save the website owner bandwidth.
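The browser behavior described above can be sketched as "reuse the stored copy unless the page has changed." In this simplified sketch, `fetch_page()` and the `last_modified` timestamps are hypothetical stand-ins for a real HTTP download and the server's modification metadata.

```python
page_cache = {}  # url -> (last_modified, content)

def fetch_page(url):
    # Placeholder for an actual network download.
    return "content of " + url

def load(url, last_modified):
    cached = page_cache.get(url)
    if cached and cached[0] == last_modified:
        return cached[1]                     # unchanged: reuse the cached copy
    content = fetch_page(url)                # new or modified: download again
    page_cache[url] = (last_modified, content)
    return content
```

Real browsers implement this with HTTP mechanisms such as `Last-Modified`/`If-Modified-Since` headers, but the decision they make is the same one shown here.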

A cache server is a computer or network device that has been set up to store web pages that have been accessed by users on a network. Any user trying to access a web page stored on the cache server is sent the stored version instead of downloading the page again. Cache servers help reduce network and Internet traffic congestion, and also save companies on bandwidth costs.