The old Computer Science story: trade space for time.¹
To improve program efficiency we can use lazy evaluation (MECpp Item 17), which pays off when computed results are not always needed. Conversely, when we must support operations whose results are almost always needed, or whose results are often needed more than once, we can adopt "over-eager evaluation": doing work ahead of time to amortize the cost of anticipated computations. Caching and prefetching are two such techniques.
Caching
Say we're writing a program that provides information about employees, and one piece of information we expect to be requested frequently is an employee's cubicle number. Cubicle numbers are stored in a database, but the database is not optimized to find them. To avoid repeated lookups, we can cache each cubicle number the first time it is fetched.
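A minimal sketch of such a cache, loosely following MECpp's findCubicleNumber example. The expensive database query is stubbed out here with a call counter so the caching effect is observable; the names and the dummy return value are illustrative:

```cpp
#include <map>
#include <string>
#include <utility>

// Stand-in for the real (expensive) database query; here it just
// counts how often it is called and returns a dummy cubicle number.
static int dbLookups = 0;
int lookUpCubicleNumberInDB(const std::string& employeeName)
{
    ++dbLookups;
    return 42;
}

int findCubicleNumber(const std::string& employeeName)
{
    // Local cache mapping employee names to cubicle numbers.
    // The static map persists across calls, so each employee
    // costs at most one database lookup.
    typedef std::map<std::string, int> CubicleMap;
    static CubicleMap cubes;

    CubicleMap::iterator it = cubes.find(employeeName);
    if (it == cubes.end()) {
        // First request for this employee: query the database
        // once, then remember the result for later calls.
        it = cubes.insert(std::make_pair(
                 employeeName,
                 lookUpCubicleNumberInDB(employeeName))).first;
    }
    return it->second;  // served from the cache
}
```

The first call for a given name pays for the database access; every subsequent call for that name is a cheap map lookup.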
Prefetching
According to the well-known locality of reference phenomenon, if data in one place is requested, it's quite common to want nearby data too. This observation justifies disk caches, memory caches for both instructions and data, and instruction prefetch.
Applying the same idea, we can adopt a prefetching strategy when writing a template for dynamic arrays that extend themselves automatically:
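Here is a simplified sketch of such a template (illustrative, not MECpp's exact code: copying is disabled for brevity, and a static allocation counter is added purely to make the over-allocation behavior observable):

```cpp
#include <algorithm>
#include <cstddef>

template <typename T>
class DynArray {
public:
    DynArray() : data_(0), size_(0), capacity_(0) {}
    ~DynArray() { delete[] data_; }

    // Indexing past the current logical size extends the array.
    // When new memory is actually required, twice the needed
    // capacity is allocated in anticipation of further growth.
    T& operator[](std::size_t index)
    {
        if (index >= capacity_) {
            std::size_t newCapacity = 2 * (index + 1);  // over-allocate
            T* newData = new T[newCapacity]();          // value-initialized
            std::copy(data_, data_ + size_, newData);
            delete[] data_;
            data_ = newData;
            capacity_ = newCapacity;
            ++allocations;  // instrumentation for demonstration only
        }
        if (index >= size_) size_ = index + 1;
        return data_[index];
    }

    std::size_t size() const { return size_; }

    static int allocations;  // number of memory allocations performed

private:
    DynArray(const DynArray&);             // not copyable in this sketch
    DynArray& operator=(const DynArray&);
    T* data_;
    std::size_t size_;
    std::size_t capacity_;
};

template <typename T>
int DynArray<T>::allocations = 0;
```

The factor of two is arbitrary but typical; any constant growth factor yields amortized-constant cost per extension, because allocations become geometrically rarer as the array grows.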
This operator[] allocates twice as much memory as needed each time the array must be extended. So if the logical size is extended twice in a row, say by a[22] = 3.5 followed by a[32] = 0, the first assignment over-allocates enough room that the second assignment needs no new memory: one allocation is saved.
¹ Not always. Using large objects means fewer of them fit on a virtual memory page or in a cache. In rare cases, making objects bigger reduces the performance of the software due to increased paging activity and/or a decreased cache hit rate.