Thursday, January 26

About Cache memory

Cache memory is a temporary memory and is usually smaller than main memory (RAM), because it does not need a large capacity to perform well. It is a fast memory that the processor uses for information it is likely to need again in the very near future. A processor is very fast and constantly reads information from main memory, which means it often has to wait for data to arrive, because main memory access times are slow relative to the CPU's speed. Cache memory addresses this by copying frequently used data into the cache, so that an access to the slower main memory is not required to retrieve that data again.
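The idea of a fast store sitting in front of a slow one can be sketched in a few lines of Python. This is only an illustrative model (the names, sizes, and artificial delay are all assumptions, not real hardware behavior): the "main memory" lookup is made slow on purpose, and a small dictionary acts as the cache in front of it.

```python
import time

# Hypothetical model: fake main memory where the value at each address
# is simply twice the address, with an artificial delay on every access.
MAIN_MEMORY = {addr: addr * 2 for addr in range(1024)}

cache = {}  # the fast store placed in front of main memory

def read(addr):
    """Return the value at addr, consulting the cache first."""
    if addr in cache:          # cache hit: fast path, no main-memory access
        return cache[addr]
    time.sleep(0.001)          # simulate the slow main-memory access time
    value = MAIN_MEMORY[addr]
    cache[addr] = value        # keep a copy for likely future reads
    return value

read(42)  # first access: a miss, goes to main memory
read(42)  # second access: a hit, served from the cache
```

After the first read of an address, every later read of it is served from the dictionary without the simulated delay, which is exactly the payoff the paragraph describes.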

A cache can be as unorganized as your author's desk or as organized as an address book. However, cache memory in a computer differs from these real-life examples in one important way: the computer has no way to know a priori (in advance) which data is most likely to be accessed, so it relies on the locality principle and transfers an entire block from main memory into the cache whenever it has to make a main-memory access. If the probability of using something else in that block is high, then transferring the entire block saves on access time. The cache location for this new block depends on two things: the cache mapping policy and the cache size.
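One common mapping policy is direct mapping, where each main-memory block can go in exactly one cache line. The sketch below shows how the block's cache location follows from the cache size and the mapping policy; the block size and line count are illustrative assumptions, not values from the text.

```python
# Assumed, illustrative parameters for a tiny direct-mapped cache.
BLOCK_SIZE = 16   # bytes per block transferred from main memory
NUM_LINES = 8     # cache capacity = NUM_LINES * BLOCK_SIZE bytes

def cache_line_for(address):
    """Which line a direct-mapped cache would use for this byte address."""
    block_number = address // BLOCK_SIZE   # which main-memory block it is in
    return block_number % NUM_LINES        # mapping policy: simple modulo

# All addresses inside one block map to the same line, so a single
# block transfer services all of them (spatial locality).
print(cache_line_for(0))    # block 0 maps to line 0
print(cache_line_for(15))   # still block 0, same line 0
print(cache_line_for(16))   # block 1 maps to line 1
print(cache_line_for(130))  # block 8 wraps around to line 0
```

Because the mapping is modulo the number of lines, a larger cache (more lines) spreads blocks across more locations before they wrap around and start competing for the same line.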