lscharen
I see the answer now. It must be cached for 'instant' access
to the already computed subparts for it to be efficient.
The dynamic programming approach is actually a good example
for the fib series because it demonstrates the weakness, and
once you compensate for that by using an if statement:
if O(1) then
here's the answer
else
perform recursive solve of fib(n-1)
it is a good replacement for the classical recursive solution
(and also a good contrast.)
Anyway, those are my thoughts for this morning.
What you wrote in pseudocode is actually a technique called
"memoization". Yes, "memo"-ize, not "memor"-ize. It's the direct way
to make recursive functions efficient by caching values. This is a
top-down method -- you begin with a problem of size n and work down to
the base cases. With dynamic programming, you work bottom-up by
starting with the base case and constructing larger solutions. Here
are the three approaches presented together -- plain recursion,
memoized recursion, and dynamic programming:
// Fibonacci numbers, starting at zero, F(0) = F(1) = 1
////
// A standard recursive method that is
// inefficient
public static int fib_recursive(int n)
{
    if ( n < 2 )
        return 1;

    return fib_recursive(n-1) + fib_recursive(n-2);
}
////
// A memoized version that is efficient
// and retains the recursive structure.
public static int[] memos;

public static int fib_memoize(int n)
{
    // Allocate the cache and seed the base cases.  The
    // array must hold at least two entries so the seeds fit.
    memos = new int[Math.max(n + 1, 2)];
    memos[0] = 1;
    memos[1] = 1;

    return fib_helper( n );
}

public static int fib_helper(int n)
{
    // This is your, 'if O(1) then here's the answer'
    if ( memos[n] != 0 )
        return memos[n];

    // Memoize the result so it is only computed once
    memos[n] = fib_helper(n-1) + fib_helper(n-2);
    return memos[n];
}
////
// A dynamic programming solution
// that builds the answer up from
// the base cases
public static int fib_dynamic(int n)
{
    if ( n < 2 )
        return 1;

    // define the base cases
    int n_2 = 1;
    int n_1 = 1;

    // A variable to hold the answer, initialized so the
    // compiler can prove it is assigned before use
    int ans = 1;

    // iteratively build up the
    // solution. Dynamic programming
    // is trivial for this problem
    for ( int i = 1; i < n; i++ )
    {
        // F(n) = F(n-1) + F(n-2);
        ans = n_2 + n_1;

        // Update the dynamic programming state
        n_2 = n_1;
        n_1 = ans;
    }

    return ans;
}
As you can see, the three implementations have different characteristics:
Algorithm       Time Complexity    Space Complexity
---------       ---------------    ----------------
Recursive       O(1.618^n)         O(n)
Memoized        O(n)               O(n)
Dynamic Prog.   O(n)               O(1)
The more complicated approaches have superior algorithmic properties,
but we pay for that performance with a more complex implementation.
Also, I should note that the O(n) space complexity of the recursive
approach comes from the stack usage of each recursive call -- the
recursion reaches a depth of n before it unwinds. This may not be
obvious at first glance.
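If you want to see the payoff concretely, here is a rough standalone sketch (the class name and the call counters are my own additions, not part of the code above) that counts how many times each version gets invoked for n = 20:

```java
public class FibCallCount {
    public static long recursiveCalls = 0;
    public static long memoizedCalls  = 0;
    public static int[] memos;

    // The plain recursive solution, plus a call counter
    public static int fibRecursive(int n) {
        recursiveCalls++;
        if (n < 2)
            return 1;
        return fibRecursive(n - 1) + fibRecursive(n - 2);
    }

    // The memoized solution, plus a call counter.  Assumes
    // memos[] has been allocated and seeded with the base cases.
    public static int fibMemoized(int n) {
        memoizedCalls++;
        if (memos[n] != 0)
            return memos[n];
        memos[n] = fibMemoized(n - 1) + fibMemoized(n - 2);
        return memos[n];
    }

    public static void main(String[] args) {
        int n = 20;
        memos = new int[n + 1];
        memos[0] = 1;
        memos[1] = 1;

        System.out.println("recursive answer: " + fibRecursive(n));
        System.out.println("recursive calls:  " + recursiveCalls);
        System.out.println("memoized answer:  " + fibMemoized(n));
        System.out.println("memoized calls:   " + memoizedCalls);
    }
}
```

For n = 20 the plain recursion makes 21,891 calls while the memoized version makes only 39 -- roughly the exponential vs. linear difference in the table above.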
-Lucas