When shopping for a computer, the word "cache" often comes up. Modern computers have two main levels of cache, L1 and L2, and some now have an L3 cache as well. Caching is a very important process in your PC.
There are memory caches, hardware and software disk caches, page caches, and more. Virtual memory is even a form of caching. Let's look at what caching is and why it is so important. Caching is a technology based on the memory subsystem of your computer.
The main purpose of a cache is to speed up your computer while keeping its price low. Caching lets you get your computing tasks done more rapidly. To understand the basic idea behind a cache, we can use a simple analogy: a librarian behind a desk.
He or she is there to give you the books you ask for. To keep it simple, let's assume that you can't get the books yourself; you have to ask the librarian for the book you want to read, and he or she gets it for you from shelving in a storeroom. This first example is a librarian without a cache.
The first person arrives and asks for the book Great Expectations. The librarian goes to the storeroom, gets the book, returns to the counter, and gives the book to the customer. Later, the customer comes back to return the book. The librarian takes the book, returns it to the storeroom, and then comes back to the counter to wait for the next customer.
The next customer comes in and also asks for Great Expectations. The librarian has to return to the storeroom to get the same book he had already handled and give it to the customer. So basically, the librarian has to make a complete round trip to fetch every book, even very popular ones that are requested frequently. This isn't a very efficient system for our librarian, is it?
However, there is a way to improve on this system: we can give the librarian a cache. To illustrate one, let's give the librarian a backpack that can hold, say, ten books. That means the librarian has a ten-book cache. Into this backpack, he or she will put the books that customers return, up to a maximum of ten.
Now, let's go back and revisit the first scenario with our cached librarian. At the beginning of the day, the librarian's cache is empty. The first person arrives and asks for Great Expectations. So the librarian goes to the storeroom, gets the book, and gives it to the customer.
When the customer returns with the book, instead of taking it back to the storeroom, the librarian puts the book into the backpack (after making sure the backpack isn't already full). Another person arrives and asks for Great Expectations.
Before going to the storeroom, the librarian checks to see if the book is already in the backpack. Lo and behold, it is! Now all he or she has to do is take the book from the backpack and hand it over. No extra energy is expended by the librarian, and the customer doesn't have to wait for that trip to the storeroom.
But what happens when a customer asks for a title that's not in the backpack? In this case, the librarian with a cache is actually less efficient than one without, because he or she must take the time to look for the book in the backpack first.
That is why one of the challenges of cache design is to minimize the impact of cache searches. Modern hardware has reduced this time delay to practically zero.
The time it takes for the librarian to look in the cache is much less than having to run to the storeroom, so time is saved automatically with a cache. The cache is small (just ten books) so the time it takes to notice a miss is only a tiny fraction of the time it takes to walk to the storeroom.
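The librarian's routine, check the backpack first, make a storeroom trip only on a miss, and keep the backpack at no more than ten books, can be sketched in a few lines of Python. This is a minimal illustration, not real hardware-cache logic; the `Backpack` class, the `trips` counter, and the choice to evict the oldest book when the backpack is full (the analogy never says which book gets removed) are all assumptions made for the example.

```python
from collections import OrderedDict

class Backpack:
    """The librarian's ten-book cache from the analogy (illustrative only)."""

    def __init__(self, capacity=10):
        self.capacity = capacity
        self.books = OrderedDict()  # title -> book, in insertion order
        self.trips = 0              # counts slow walks to the storeroom

    def fetch(self, title):
        if title in self.books:
            # Cache hit: the quick backpack check pays off, no trip needed.
            return self.books[title]
        # Cache miss: make the slow round trip to the storeroom.
        self.trips += 1
        book = f"copy of {title}"
        if len(self.books) >= self.capacity:
            # Backpack is full: remove the oldest book to make room
            # (an assumed eviction policy, not specified in the analogy).
            self.books.popitem(last=False)
        self.books[title] = book
        return book

pack = Backpack()
pack.fetch("Great Expectations")  # first request: one storeroom trip
pack.fetch("Great Expectations")  # second request: served from the backpack
print(pack.trips)                 # 1 trip instead of 2
```

Just as in the analogy, the second request for the same title costs only the cheap backpack lookup, which is why the trip counter stays at one.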