We'll take as our first example a piece of code from my previous post - a call to a function called foldLeft:
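From the values we'll trace with below - a List(1,2,3), a starting value of 0, and (_ + _) as the combining function - the call looks like this:

foldLeft(List(1,2,3), 0) (_ + _)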
To begin, we also need the internals of this foldLeft function:
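Its shape (reconstructed here to match the 'Nil' and 'h :: t' cases we trigger below, so treat it as a sketch rather than a verbatim copy) is along these lines:

def foldLeft[A,B](as: List[A], b: B)(f: (B, A) => B): B =
  as match {
    case Nil => b
    case h :: t => foldLeft(t, f(b, h))(f)
  }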
With these two pieces of code in hand, we can begin to trace. To do so we ask ourselves, 'what does our starting piece of code become if we replace the call to the function with the elements of the function that will be executed in the call?' So, given this for the first line of our trace:
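foldLeft(List(1,2,3), 0) (_ + _)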
We end up with a second trace line as follows:
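foldLeft(List(1,2,3), 0) (_ + _)
foldLeft(2 :: 3 :: Nil, (0 + 1)) (_ + _)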
Let's work through this slowly - what happened? It's actually a combination of micro steps. When we make the call, 'as' is List(1,2,3), 'b' is 0, and 'f' is (_ + _). Consequently, the pattern match will trigger on 'case h :: t', and we're left with a new call to foldLeft.
This new foldLeft call will have parameters based on the application of the right hand side of the case statement:
- 't' is the tail of the original 'as' List (now expressed in terms of '::' (or "Cons")) - 2 :: 3 :: Nil,
- 'b' is now the application of the function f to the original value of 'b' and the value of the head of 'as' - f(0, 1) or (0 + 1),
- and 'f' is still 'f' - (_ + _)

ASIDE: For some (e.g. me as I was learning this technique) this is too big a jump the first time. In this case you can trace out intermediate steps, which will give you:
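(My own sketch of those micro-steps - first the List rewritten in '::' form, then the new 'b' written as an application of f, then f expanded:)

foldLeft(List(1,2,3), 0) (_ + _)
foldLeft(1 :: 2 :: 3 :: Nil, 0) (_ + _)
foldLeft(2 :: 3 :: Nil, f(0, 1)) (_ + _)
foldLeft(2 :: 3 :: Nil, (0 + 1)) (_ + _)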
Before we continue and apply the technique again to our new foldLeft input, there is an important thing to note (and an illustration of a powerful benefit of the tracing technique). Note that on the last line we've *not* collapsed the function 'f' - we have '(0 + 1)' rather than '1'. Why not? We have all the information to hand; we could have done the maths and nothing would have broken. The reason we leave it is a convention of tracing - we are aiming simply to reduce all the function calls to their simple equivalents. It is nice to leave the simple elements of a trace explicit until all the calls elsewhere have been made, so that you can see the resulting expression in the raw, and complete - the one that will be executed when the code runs.
Let's move on. When we left off, we had another foldLeft call to trace:
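foldLeft(2 :: 3 :: Nil, (0 + 1)) (_ + _)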
So let's apply the same logic as last time and see what we get. Again we trigger on the second case statement, giving the following values to the right-hand side:
- 't' is the tail of the 'as' in the current context - 3 :: Nil,
- 'b' is the application of the function f to the current value of b and the head of 'as' - f((0 + 1), 2), or ((0 + 1) + 2),
- and again 'f' is still 'f' - (_ + _)

Mapping this onto the right-hand-side call to foldLeft gives the next line of trace as follows:
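foldLeft(3 :: Nil, ((0 + 1) + 2)) (_ + _)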
We need to keep going - we still have a foldLeft call to expand. We'll not labour the point of this one (it's the same as the previous step). Now we have (I've omitted the micro-steps for clarity and brevity):
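foldLeft(List(1,2,3), 0) (_ + _)
foldLeft(2 :: 3 :: Nil, (0 + 1)) (_ + _)
foldLeft(3 :: Nil, ((0 + 1) + 2)) (_ + _)
foldLeft(Nil, (((0 + 1) + 2) + 3)) (_ + _)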
This leaves us with yet another call to foldLeft, but this time we're going to trigger the first case statement. What happens in this circumstance? We forget about everything else, and simply return b - which in our case is '(((0 + 1) + 2) + 3)'. Let's add it to the bottom as usual:
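foldLeft(List(1,2,3), 0) (_ + _)
foldLeft(2 :: 3 :: Nil, (0 + 1)) (_ + _)
foldLeft(3 :: Nil, ((0 + 1) + 2)) (_ + _)
foldLeft(Nil, (((0 + 1) + 2) + 3)) (_ + _)
(((0 + 1) + 2) + 3)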
And with that we've fallen out of the bottom of foldLeft - no more calls, just simple maths. Some people stop here - our result is now clear. Others like to add the final line of the trace as the result of this equation. It's kind of up to you. I'm not going to bother, both to save you scrolling even more and because I don't want to patronise you.
Given all this, below are some more traces which you can look at to embed this technique fully. I encourage you to do some of your own. It really helps.
First up is a 'product' function which simply calls foldLeft with a multiplicative function.
Given: def product(l: List[Double]) = foldLeft(l, 1.0) (_ * _)
When: product(List(10.0,23.0,96.0))
Then:
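(A hand-worked sketch of the trace, following the same steps as the foldLeft example walked through above:)

product(List(10.0,23.0,96.0))
foldLeft(List(10.0,23.0,96.0), 1.0) (_ * _)
foldLeft(23.0 :: 96.0 :: Nil, (1.0 * 10.0)) (_ * _)
foldLeft(96.0 :: Nil, ((1.0 * 10.0) * 23.0)) (_ * _)
foldLeft(Nil, (((1.0 * 10.0) * 23.0) * 96.0)) (_ * _)
(((1.0 * 10.0) * 23.0) * 96.0)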
Let's move on from foldLeft to foldRight. This one is from Functional Programming in Scala:
Given: def foldRight[A,B](as: List[A], b: B)(f: (A, B) => B): B =
  as match {
    case Nil => b
    case x :: xs => f(x, foldRight(xs, b)(f))
  }
When: foldRight(List(1,2,3), Nil) (List(_,_))
Then:
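(Again a hand-worked sketch - note that, unlike foldLeft, the calls nest on the right-hand side and nothing collapses until we reach Nil:)

foldRight(List(1,2,3), Nil) (List(_,_))
List(1, foldRight(2 :: 3 :: Nil, Nil) (List(_,_)))
List(1, List(2, foldRight(3 :: Nil, Nil) (List(_,_))))
List(1, List(2, List(3, foldRight(Nil, Nil) (List(_,_)))))
List(1, List(2, List(3, Nil)))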
Finally here's another foldRight. Note that this one is tricksy, due to the slightly unusual higher order function that's used. It's instructive here to see the trace in action. (Note: I've added the function calls but they don't compile. You could fix this by substituting the anonymous function in the example for a named one.)
Given: def foldRight[A,B](as: List[A], b: B)(f: (A, B) => B): B =
  as match {
    case Nil => b
    case x :: xs => f(x, foldRight(xs, b)(f))
  }
And: def length[A](as: List[A]): Int = foldRight(as, 0)((_, acc) => acc + 1)
When: length(List (5,6,7))
Then:
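(One more hand-worked sketch - I've written the anonymous function applications out in full here, which, as noted above, won't compile as-is:)

length(List(5,6,7))
foldRight(List(5,6,7), 0) ((_, acc) => acc + 1)
((_, acc) => acc + 1)(5, foldRight(6 :: 7 :: Nil, 0) ((_, acc) => acc + 1))
((_, acc) => acc + 1)(5, ((_, acc) => acc + 1)(6, foldRight(7 :: Nil, 0) ((_, acc) => acc + 1)))
((_, acc) => acc + 1)(5, ((_, acc) => acc + 1)(6, ((_, acc) => acc + 1)(7, foldRight(Nil, 0) ((_, acc) => acc + 1))))
((_, acc) => acc + 1)(5, ((_, acc) => acc + 1)(6, ((_, acc) => acc + 1)(7, 0)))
((_, acc) => acc + 1)(5, ((_, acc) => acc + 1)(6, (0 + 1)))
((_, acc) => acc + 1)(5, ((0 + 1) + 1))
(((0 + 1) + 1) + 1)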
Power Tips
Before we close it's worth mentioning a few tips to help you get the most from this learning tool:
- formatting is your friend - as you can see from the above examples, use spaces liberally to make clear to yourself how things are progressing
- so is your IDE - every trace line should parse and compile, and give the same result (though you might have to replace anonymous functions with named ones)
- use this when thinking about execution time, tail recursion, and the design of your own algorithms - the functional style can be hard to make the jump to when you're coming from an imperative background, and this technique helps significantly. By mapping an algorithm out line by line you can check and understand it, see any potential problems it has, and see why it is or isn't tail recursive.