Orders of Growth

What does it mean to say a procedure takes Θ(N) time? The lecture notes give a fairly formal definition, but the basic idea is that this means "the amount of work I do is proportional to N", where N is something like "the number of items in the input sentence". Thus, if I have twice as many items, I have to do twice as much work. (We also call this "linear" time.) A good example of this is the COUNT procedure, which has to look at every word in the sentence to know how long it is.
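For reference, here's one way the straightforward COUNT might be written (assuming the usual EMPTY? and BF from the class library; your version from lab may differ slightly):

    (define (count sent)
      (if (empty? sent)
          0
          (+ 1 (count (bf sent))) ))

Every word in SENT costs one recursive call, so the work grows in direct proportion to the number of words: Θ(N).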

How about some other cases? Θ(1), or "constant" time, applies to things that don't care how big the input is. A good example here is FIRST. You can pass in a sentence with one word or a million words, and it will take the same amount of time to get the first word.

Let's try a few practice problems. If it takes 10 minutes for me to process a sentence of N words on my laptop, and I borrow a friend's server that works four times as fast, how long a sentence can I process in 10 minutes if my algorithm...
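To illustrate the arithmetic (this isn't one of the actual parts of the problem): if the algorithm is Θ(N), a machine that's four times as fast does four times the work in the same 10 minutes, so I can handle a sentence of 4N words. If the algorithm were Θ(N²), I'd look for the length M whose work equals four times the laptop's:

    M² = 4N²   ⟹   M = 2N

so the 4x-faster server only buys me a sentence twice as long.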

Occasionally you will also see things like O(N) (instead of Θ(N)). This just means a procedure's work is at most proportional to the length of its input, i.e. if I put in twice as many words, it can take up to twice as long. Thus, FIRST is Θ(1), which means it is O(1), but it is also O(N), because doubling the length of its input certainly takes no more than double the time. Conversely, COUNT is O(N) but not O(1), because doubling the length of its input does affect the time. It's good to know the difference, though sometimes people say O when they mean Θ.

Some more practice! What is the order of growth of each of the following procedures? What is N?

Iteration

Another way to measure the cost of a program is by how much space it requires. For a simple program like the straightforward COUNT, you need space for every recursive call. Why? Because the first call to COUNT isn't done until it can add 1 to the next call to COUNT, which isn't done until...all the way down to the base case. Only once we reach the base case can all the intermediate calls actually release their memory (called "stack space") back to the computer.
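To see why, here's roughly how the process unfolds for a three-word sentence (assuming the straightforward recursive COUNT from lecture; each line shows the pending work that must be held in memory):

    (count '(she loves you))
    (+ 1 (count '(loves you)))
    (+ 1 (+ 1 (count '(you))))
    (+ 1 (+ 1 (+ 1 (count '()))))
    (+ 1 (+ 1 (+ 1 0)))
    3

The expression gets wider and wider on the way down, one pending + 1 per word, so the space needed grows with N.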

If a program is written more carefully, however, it can use a fixed amount of memory, rather than one that grows with the amount of recursion you have. The term for such a program is iterative. For example, here's an iterative version of COUNT:

(define (count sent)
  (define (iter sent count-so-far)
    (if (empty? sent)
        count-so-far
        (iter (bf sent) (+ 1 count-so-far)) ))
  (iter sent 0) )

It's a bit more complicated than the standard COUNT for sure. But there's an important difference. Notice how the call to the helper procedure ITER is the last thing that happens in COUNT. This is called a tail call. Moreover, each recursive call in ITER is also a tail call; this is often called being tail-recursive.

You might think we have the same problem as before: doesn't COUNT still have to wait for ITER to finish before returning its result? Scheme, however, is smart enough to realize that all COUNT is going to do with the value it gets from ITER is return it, and so just tells ITER to return its result to whoever called COUNT. That means that COUNT doesn't have to sit around waiting for ITER, and can give its memory back right away.
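Here's a sketch of the same three-word sentence run through the iterative COUNT above. Notice there are no pending operations piling up; each call hands off completely to the next:

    (count '(she loves you))
    (iter '(she loves you) 0)
    (iter '(loves you) 1)
    (iter '(you) 2)
    (iter '() 3)
    3

Every line is the same "width" no matter how long the sentence is, which is exactly why the process needs only a fixed amount of space.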

If a procedure calls another procedure that isn't iterative, does the first procedure still count as iterative? Not really—the space required to run the first procedure has to include the space for the second. Moral of the story: if every recursive call in a procedure is a tail call, and everything the procedure calls is iterative, that procedure is iterative as well!
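For instance, suppose we write the following (REPORT-LENGTH is a made-up example, and COUNT here means the straightforward recursive version, not the iterative one above):

    (define (report-length sent)
      (count sent) )

The call to COUNT is a tail call, but the process is still not iterative: COUNT itself builds up a chain of pending additions, so running REPORT-LENGTH still takes space proportional to the length of SENT.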

Let's try a few examples. Which of the following generate an iterative process, and which generate a recursive (not iterative) process?

Note: "recursive" just means a function that calls itself. Thus in Scheme, all "iterative" functions are recursive! (Other languages use loops.) But when we refer to a recursive process, we’re talking about the trait of requiring more and more memory as each call waits for the next to finish. Iterative processes only require a fixed amount of space.

A few thought questions
