Recursion Usage and Concepts - Newbie Question


lscharen

I see the answer now. The already-computed subparts must be cached for
'instant' access for it to be efficient. The dynamic programming
approach is actually a good example for the fib series because it
demonstrates the weakness, and once you compensate for that by using
an if statement:

if O(1) then
here's the answer
else
perform recursive solve of fib(n-1) + fib(n-2)

it is a good replacement for the classical recursive solution
(and also a good contrast).

Anyway, those are my thoughts for this morning. :)

What you wrote in pseudo code is actually a technique called
"memoization". Yes, "memo"-ize, not "memor"-ize. It's the direct way
to make recursive functions efficient by caching values. This is a
top-down method -- you begin with a problem of size n and work down to
the base cases. With dynamic programming, you work bottom-up by
starting with the base case and constructing larger solutions. Here
are the three approaches presented together -- recursive, memoized
recursion and dynamic programming:

// Fibonacci numbers, starting at zero, F(0) = F(1) = 1

////
// A standard recursive method that is
// inefficient.
public static int fib_recursive(int n)
{
    if ( n < 2 )
        return 1;

    return fib_recursive(n-1) + fib_recursive(n-2);
}


////
// A memoized version that is efficient
// and retains the recursive structure.
public static int memos[];

public static int fib_memoize(int n)
{
    // Size the cache for values 0..n (at least 2 slots
    // so the base cases always fit).
    memos = new int[Math.max(n + 1, 2)];
    memos[0] = 1;
    memos[1] = 1;

    return fib_helper( n );
}

public static int fib_helper(int n)
{
    // This is your 'if O(1) then here's the answer'
    if ( memos[n] != 0 )
        return memos[n];

    // Memoize the results
    memos[n-1] = fib_helper(n-1);
    memos[n-2] = fib_helper(n-2);

    return memos[n-1] + memos[n-2];
}


////
// A dynamic programming solution
// that builds the answer up from
// the base cases.
public static int fib_dynamic(int n)
{
    if ( n < 2 )
        return 1;

    // define the base cases
    int n_2 = 1;
    int n_1 = 1;

    // A variable to hold the answer
    int ans = 1;

    // iteratively build up the
    // solution. Dynamic programming
    // is trivial for this problem.
    for ( int i = 1; i < n; i++ )
    {
        // F(n) = F(n-1) + F(n-2)
        ans = n_2 + n_1;

        // Update the dynamic programming state
        n_2 = n_1;
        n_1 = ans;
    }

    return ans;
}


As you can see, the three implementations have different characteristics:

Algorithm       Time Complexity    Space Complexity
---------       ---------------    ----------------
Recursive       O(1.62^n)          O(n)
Memoized        O(n)               O(n)
Dynamic Prog.   O(n)               O(1)

The more complicated approaches have superior algorithmic properties,
but we pay for that performance with a more complex implementation.
Also, I should note that the O(n) space complexity of the recursive
approach comes from the stack usage of each recursive call. This may
not be obvious at first glance.

-Lucas
 

Taria

With dynamic programming, you work bottom-up by
starting with the base case and constructing larger
solutions.

Question here: So if dynamic programming works from the bottom up,
then it really is similar to an iterative approach, isn't it? So is
the only difference here that it uses a recursive call to compute
and caches that instead of accumulating the answer into a variable?

And if I were to compare the two, iterative vs. dynamic programming,
the iterative approach would seem 1) more readable, and 2) the same
time complexity. I can't even begin to figure out what the space
overhead would be for an iterative approach. But if I were to
exercise my newly formed idea of what it would be, the only thing it
needs to remember is the sum. So my guess would be O(1). (I am
afraid to post what I think are answers to problems for fear of
getting 'shot!' :p but I'll take a chance this time because mistakes
are the best to learn from.)
As you can see, the three implementations have different characteristics:

Algorithm       Time Complexity    Space Complexity
---------       ---------------    ----------------
Recursive       O(1.62^n)          O(n)
Memoized        O(n)               O(n)
Dynamic Prog.   O(n)               O(1)
Another question here... why is the space complexity only O(1) for
the dynamic programming version? Does that represent where the
answers are stored?

Dynamic programming is a subject that will be covered this week; I'm
quite curious about it now.

-t
 

Patricia Shanahan

Taria said:
Question here: So if dynamic programming works from the bottom up,
then it really is similar to an iterative approach, isn't it? So is
the only difference here that it uses a recursive call to compute
and caches that instead of accumulating the answer into a variable?

Dynamic programming can be done with no recursion. The distinguishing
feature is that dynamic programming algorithms work by building up
subproblem solutions in an organized fashion.

Patricia
 

lscharen

Question here: So if dynamic programming works from the bottom up,
then it really is similar to an iterative approach, isn't it? So is
the only difference here that it uses a recursive call to compute
and caches that instead of accumulating the answer into a variable?

And if I were to compare the two, iterative vs. dynamic programming,
the iterative approach would seem 1) more readable, and 2) the same
time complexity. I can't even begin to figure out what the space
overhead would be for an iterative approach. But if I were to
exercise my newly formed idea of what it would be, the only thing it
needs to remember is the sum. So my guess would be O(1). (I am
afraid to post what I think are answers to problems for fear of
getting 'shot!' :p but I'll take a chance this time because mistakes
are the best to learn from.)

Dynamic programming *is* the iterative solution. Also, dynamic
programming and recursion are opposite ways of solving the same
problem, so when you say "the only difference here is it (dynamic
programming) uses a recursive call to compute", that is confusing
the two approaches. Take another look at the dynamic programming
solution to the Fibonacci sequence; you can see that it *never*
calls itself (there are no calls to fib_dynamic), so it does not
perform *any* recursion at all.

As far as the space complexity, I'm afraid I might have skipped ahead
too far and muddled the relationship between memoization and dynamic
programming. Let me try again. Compare these two methods, and assume
that the array cached[] has room for values 0 through n and that
cached[0] and cached[1] have already been set to 1.

public static int fib_memoize(int n)
{
    // If we have already solved this subproblem,
    // just return the answer immediately.
    if ( cached[n] != 0 )
        return cached[n];

    // Memoize the results. Notice that we started
    // with a solution of size n and are saving the
    // solutions to the subproblems. We are
    // recursively invoking fib_memoize.
    cached[n-1] = fib_memoize(n-1);
    cached[n-2] = fib_memoize(n-2);

    return cached[n-1] + cached[n-2];
}

public static int fib_dynamic(int n)
{
    // Define the base cases. Think of
    // this as solving the two smallest
    // problems.
    cached[0] = 1;
    cached[1] = 1;

    // Build the solution by dynamic
    // programming. Notice we do not
    // call fib_dynamic at any point, so
    // this is *not* a recursive solution.
    for ( int i = 2; i <= n; i++ )
        cached[i] = cached[i-1] + cached[i-2];

    return cached[n];
}

We can write the same functions for factorial, too. Memoization is
not needed in this case because the recursive solution is already
efficient: each subproblem is solved only once. The recursive
factorial will use O(n) space because of the n calls to itself. The
dynamic programming solution will also use O(n) space as it fills up
the array.

public static int factorial(int n)
{
    if ( n == 0 )
        return 1;

    return n * factorial(n - 1);
}

public static int factorial_dynamic(int n)
{
    // "solve" the base case
    cached[0] = 1;

    // build up the solution from the smallest problem to
    // the largest problem
    for ( int i = 1; i <= n; i++ )
        cached[i] = i * cached[i-1];

    return cached[n];
}

Of course the dynamic programming solution can be optimized, because
computing the next factorial value requires only the previous value.
Thus we can write the factorial function to use only O(1) space. The
same approach is used in the Fibonacci dynamic programming solution
from my first post.

public static int factorial_dynamic_2(int n)
{
    // "solve" the base case (n = 0) and make
    // it the current solution
    int solution = 1;

    // build up the solution from the smallest problem to
    // the largest problem
    for ( int i = 1; i <= n; i++ )
        solution = i * solution;

    return solution;
}
Another question here... why is the space complexity only O(1) for
the dynamic programming version? Does that represent where the
answers are stored?

Hopefully the new examples make this clear. There is no inherent
reason for dynamic programming to be more space efficient than
recursion. In the specific case of the Fibonacci sequence, we only
need the last two values in order to calculate the next value, so
the space complexity can be improved from O(n) (the size of the
cached[] array) to O(1) (the space for holding two values). Consider
the O(1) dynamic programming solution to be an optimization.

-Lucas
 

Taria

Hopefully the new examples make this clear. There is no inherent
reason for dynamic programming to be more space efficient than
recursion. In the specific case of the Fibonacci sequence, we only
need the last two values in order to calculate the next value, so
the space complexity can be improved from O(n) (the size of the
cached[] array) to O(1) (the space for holding two values). Consider
the O(1) dynamic programming solution to be an optimization.

-Lucas

Your explanations have helped make the concept of DP clearer. Thanks
a million!

However, in the lecture I just had today, dynamic programming was
intertwined with the notion of optimal binary search trees. :x

I tried hard to fit the definition of DP into the lecture, but
couldn't spend too much time doing that because there was a lot of
focus on how to derive the numerical value of the optimal path (?)
of an optimal binary tree given X amount of nodes with their
probabilities attached. Thankfully, the professor tried to make it
easier by not including the dummy nodes that Cormen mentions. In the
book, Cormen explains how he uses DP by creating a table that
contains the root and an e (something I have no clue what he's
referring to, not yet anyway) once he calculates the values between
this node and that node. (That part makes sense.)

Since I have a few days free of homework, I'm going to write an
iterative program that will do this for me, then see if I can do it
recursively, and then try to add in the lookup table for values
already calculated. It doesn't seem hard right now, but I won't know
till I try it. :p
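Here is a rough sketch of what I think the bottom-up table filling
will look like, ignoring the dummy nodes like the professor did (the
obstCost name and the 1-based p[] layout are just my guesses at this
point, so take it with a grain of salt):

////
// Expected cost of an optimal binary search tree over keys 1..n,
// ignoring Cormen's dummy keys. p[i] is the access probability of
// key i (p[0] is unused). e[i][j] is the cost of an optimal BST
// over keys i..j -- that should be the "e" table from the book --
// and w[i][j] is the total probability of keys i..j.
public static double obstCost(double[] p, int n)
{
    double[][] e = new double[n + 2][n + 1];
    double[][] w = new double[n + 2][n + 1];

    // Base cases: an empty range of keys costs nothing.
    for ( int i = 1; i <= n + 1; i++ )
    {
        e[i][i - 1] = 0.0;
        w[i][i - 1] = 0.0;
    }

    // Build up from ranges of length 1 to length n, just like
    // fib_dynamic builds F(2)..F(n) from the base cases.
    for ( int len = 1; len <= n; len++ )
    {
        for ( int i = 1; i + len - 1 <= n; i++ )
        {
            int j = i + len - 1;
            w[i][j] = w[i][j - 1] + p[j];
            e[i][j] = Double.MAX_VALUE;

            // Try every key r in i..j as the root of this subtree.
            for ( int r = i; r <= j; r++ )
            {
                double cost = e[i][r - 1] + e[r + 1][j] + w[i][j];
                if ( cost < e[i][j] )
                    e[i][j] = cost;
            }
        }
    }

    return e[1][n];
}

If that's roughly right, the memoized version should just be the
same recurrence written top-down, with a check against the e table
before recursing.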

I appreciate all the help given to me in the past posts, btw.

-t
 

bugbear

Eric said:
Recursion refers to a routine that calls itself, as with the classic Fibonacci
sequence method:

public static int fibonacci( int n )
{
    if ( n < 0 ) return 0;
    if ( n == 0 || n == 1 )
    {
        return 1;
    }
    return fibonacci( n - 1 ) + fibonacci( n - 2 );
}

Notice that fibonacci() calls itself, unless certain conditions are met. If
those conditions didn't exist, the recursion would never end.

Notice also that the function does exactly as I
described: It decomposes the maxi-problem of finding
fibonacci(n) into the mini-problems of fibonacci(n-1)
and fibonacci(n-2), then it combines the solutions of
those smaller problems to reach its own solution. QED.

As an aside, I have always been appalled at the use
of Fibonacci numbers to illustrate recursion, because
recursion is probably the worst[*] way to compute them.
The obvious iterative solution executes in O(n) time, and
there's also an O(1) method, albeit with some drawbacks.
Why would anyone in his right mind choose the O(e^n)
recursive method? Oh, yeah: Because he's a classroom
drudge trying to teach recursion, and it's the only
problem that comes to mind ...

Before dismissing recursion as a solution
to fibonacci, read this page:

http://www.csse.monash.edu.au/~lloyd/tildeAlgDS/Recn/Binary/

I found it highly instructive.
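As an aside, the O(1) method Eric alludes to is presumably the
closed-form (Binet) formula. Here is a rough sketch of it, with the
usual floating-point caveat; the fib_closed_form name is just
illustrative.

////
// A rough sketch of the O(1) closed-form (Binet) approach. The
// drawback is real: double precision starts rounding to the wrong
// integer for larger n (roughly n > 70), so this is only
// illustrative.
public static long fib_closed_form(int n)
{
    double sqrt5 = Math.sqrt(5.0);
    double phi = (1.0 + sqrt5) / 2.0;

    // With this thread's convention F(0) = F(1) = 1, fib(n) is the
    // (n+1)-th classical Fibonacci number.
    return Math.round(Math.pow(phi, n + 1) / sqrt5);
}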

BugBear
 
