Computer Science

Memoization

Memoization is a technique used in computer programming to speed up a program by caching the results of function calls. When a function is called again with the same input, the cached result is returned instead of being recomputed. This can significantly improve performance.

Written by Perlego with AI-assistance

3 Key excerpts on "Memoization"


    Functional Programming in C#

    Classic Programming Techniques for Modern Projects

    • Oliver Sturm (Author)
    • 2011 (Publication Date)
    • Wiley (Publisher)
    It becomes possible to use Memoization with functions that are not pure. The first reason is quite obvious: remembering every single result that a function calculates for the duration of an application run is not always the best strategy. Of course, a particular instance of a function doesn’t have to be memoized for the length of an application run, so scoping can be one approach to keep memory use within bounds. The second reason should be handled with a lot of care, because using Memoization with non-pure functions cannot be recommended lightly. But there are cases where such an approach can be useful, for instance when dealing with persistent data from databases, or data that is being received through network connections. In those cases, a caching approach like Memoization is still a promising strategy when coupled with careful expiry of outdated cache elements. Depending on the algorithm, an automated cache cleanup can be very useful, based, for instance, on a high-water mark applied to the number of elements in the cache, or on a certain time that can elapse before an element is regarded as outdated.

    Summary

    The need for strategies to remember previous results of computations is not specific to functional programming. However, with lazy evaluation mechanisms and the general guideline of calling functions as much as possible, it is especially important in functional programming to have caching technology easily available. Precomputation and Memoization are two techniques of storing calculated data before and after it’s first needed.

    Calling Yourself

    WHAT’S IN THIS CHAPTER?

    • Recursion in C#
    • Tail Recursion
    • Accumulator Passing Style
    • Continuation Passing Style
    • Indirect Recursion

    In functional programming languages, recursion is a tool with a lot of tradition. Many of the original functional languages didn’t have any loop constructs, and recursion was used for all cases where looping is typically used in imperative languages.
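The cleanup policies the excerpt mentions, a high-water mark on the number of cached elements and a time after which an element counts as outdated, can be sketched as follows. This is a rough illustration only; the constants and function names are assumptions, not Sturm's code (and his examples are in C#, not Python):

```python
import time

# Illustrative cache cleanup: entries expire after a time-to-live,
# and the cache is trimmed once it passes a high-water mark.
TTL_SECONDS = 60.0     # assumed expiry time
MAX_ENTRIES = 1000     # assumed high-water mark

def memoize_with_expiry(fn):
    cache = {}  # maps args -> (timestamp, value)
    def wrapper(*args):
        now = time.monotonic()
        hit = cache.get(args)
        if hit is not None and now - hit[0] < TTL_SECONDS:
            return hit[1]                    # fresh cached value
        value = fn(*args)                    # outdated or missing: recompute
        cache[args] = (now, value)
        if len(cache) > MAX_ENTRIES:         # high-water mark reached:
            oldest_first = sorted(cache, key=lambda k: cache[k][0])
            for key in oldest_first[: len(cache) // 2]:
                del cache[key]               # evict the older half
        return value
    return wrapper

@memoize_with_expiry
def lookup(record_id):
    return record_id * 2    # stand-in for a database or network fetch
```

Evicting the older half at the high-water mark is one arbitrary choice; real caches often use LRU or size-weighted policies instead.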

    Higher-Order Perl

    Transforming Programs with Programs

    There aren’t too many ideas that are both good and simple. The few that we have are used everywhere. Caching is one of these. Your web browser caches the documents it retrieves from the network. When you ask for the same document a second time, the browser retrieves the cached copy from local disk or memory, which is fast, instead of downloading it again. Your domain name server caches the responses that it receives from remote servers. When you look up the same name a second time, the local server has the answer ready and doesn’t have to carry on another possibly time-consuming network conversation. When your operating system reads data from the disks, it probably caches the data in memory, in case it’s read again; when your CPU fetches data from memory, it caches the data in a special cache memory that is faster than the regular main memory.
    Caching comes up over and over in real programs. Almost any program will contain functions where caching might yield a performance win. But the best property of caching is that it’s mechanical. If you have a function, and you would like to speed it up, you might rewrite the function, or introduce a better data structure, or a more sophisticated algorithm. This might require ingenuity, which is always in short supply. But adding caching is a no-brainer; the caching transformation is always pretty much the same. This:
    turns into this:
    The transformation is almost exactly the same for every function. The only part that needs to vary is the join ',', @_ line. This line is intended to turn the function’s argument array into a string, suitable for a hash key. Turning arbitrary values into strings like this is called serialization or marshalling. The preceding join ',', @_ example works only for functions whose arguments are numbers or strings that do not contain commas. We will look at the generation of cache keys in greater detail later on.
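The Perl transformation described here carries over directly to other languages. The sketch below is a Python analogue rather than the book's own listing; it builds the cache key the same way, by joining the arguments with commas, and therefore shares the same limitation for arguments that contain commas:

```python
# Python analogue of the book's caching transformation: the cache key
# is the argument list joined with commas so, as the text notes, it is
# only safe for arguments that are numbers or comma-free strings.
def memoize_join(fn):
    cache = {}
    def wrapper(*args):
        key = ','.join(str(a) for a in args)   # the one line that varies
        if key not in cache:
            cache[key] = fn(*args)
        return cache[key]
    return wrapper

@memoize_join
def add(a, b):
    return a + b

print(add(2, 3))   # 5

# The pitfall: two different argument lists collide on the key 'a,b,c'.
print(','.join(['a,b', 'c']) == ','.join(['a', 'b,c']))   # True
```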

    3.4 Memoization

    Adding the caching code to functions is not very much trouble. And as we saw, the changes required are the same for almost any function. Why not, then, get the computer to do it for us? We would like to tell Perl that we want caching behavior enabled on a function. Perl should be able to perform the required transformation automatically. Such automatic transformation of a function to add caching behavior is called Memoization, and the function is said to be memoized.
    The standard Memoize module, which I wrote, does this. If the Memoize module is available, we do not need to rewrite the fib code at all. We simply add two lines at the top of our program:
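The two Perl lines themselves fall outside this excerpt, so they are not reproduced here. As a rough analogue of memoizing a function without rewriting its body, Python's standard library provides functools.lru_cache; the fib definition below is illustrative:

```python
from functools import lru_cache

def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Analogue of enabling memoization without touching the function body:
# rebinding the name means the recursive calls also hit the cache.
fib = lru_cache(maxsize=None)(fib)

print(fib(35))   # 9227465, in linear rather than exponential time
```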

    C++ Data Structures and Algorithm Design Principles

    Leverage the power of modern C++ to build robust and scalable applications

    • John Carey, Shreyans Doshi, Payas Rajan (Authors)
    • 2019 (Publication Date)
    • Packt Publishing (Publisher)
    For example, finding the solution for F(2) is going to require the same set of calculations, regardless of whether you need it to solve F(4) or F(3). This demonstrates the second defining characteristic of dynamic programming problems, which is known as the optimal substructure. A problem is said to exhibit an optimal substructure when the optimal solution to the overall problem can be formed through some combination of the optimal solutions of its subproblems. For a problem to be solvable using dynamic programming, it must possess these two properties. Because of the overlapping subproblems property, the complexity of these problems tends to increase exponentially as the input increases; however, exploiting the optimal substructure property makes it possible to reduce the complexity significantly. So, in essence, the purpose of DP is to devise a method of caching previous solutions as a means to avoid the repeated calculation of previously solved subproblems.

    Memoization – The Top-Down Approach

    No, this is not "memorization," though that would also describe this technique quite accurately. Using Memoization, we can reformulate the top-down solution we described previously to make use of the optimal substructure property exhibited by the Fibonacci sequence. Our program logic will essentially be the same as it was before, only now, after having found the solution at every step, we will cache the results in an array, indexed according to the current value of n (in this problem, n represents the state or set of parameters defining the current recursive branch). At the very beginning of each function call, we will check to see whether we have a solution available in the cache for state F(n)
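The top-down procedure the excerpt describes, checking the cache for state F(n) at the start of each call and storing each newly found solution in an array indexed by n, can be sketched as follows. This is a Python illustration of that description, not the book's C++ code; the sentinel value -1 is an assumption:

```python
# Top-down (memoized) Fibonacci as described above: results are cached
# in an array indexed by n, and the cache is checked before recursing.
def fib_memoized(n):
    cache = [-1] * (n + 1)      # -1 marks "not yet computed" (assumed sentinel)
    def solve(k):
        if k < 2:
            return k            # base cases F(0) = 0, F(1) = 1
        if cache[k] == -1:      # only recurse on a cache miss
            cache[k] = solve(k - 1) + solve(k - 2)
        return cache[k]
    return solve(n)

print(fib_memoized(40))   # 102334155
```

Each state is solved at most once, so the exponential recursion tree collapses to O(n) subproblem evaluations.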
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.