# Lab 12: Iterators, Generators, and Streams

*Due at 11:59pm on Tuesday, 8/1/2017.*

## Starter Files

Download lab12.zip. Inside the archive, you will find starter files for the questions in this lab, along with a copy of the OK autograder.

## Submission

By the end of this lab, you should have submitted the lab with
`python3 ok --submit`. You may submit more than once before the
deadline; only the final submission will be graded.

- To receive credit for this lab, you must complete Questions 1, 2, 3, 4, 5, 6, and 7 in lab12.py and submit through OK.
- Questions 8, 9, 10, and 11 are *extra practice*. They can be found in the lab12_extra.py file. It is recommended that you complete these problems on your own time.

## Iterables and Iterators

In lecture, we studied several Python object interfaces, or *protocols*. In this
lab, we will study a new protocol, the *iterator protocol*. Implementing this
protocol allows us to use our objects in `for` loops! Remember the `for` loop?
(We really hope so!)

```
for elem in something_iterable:
    # do something
```

`for` loops work on any object that is *iterable*. We previously described it as
working with any sequence -- all sequences are iterable, but there are other
objects that are also iterable! As it turns out, `for` loops are actually
translated by the interpreter into the following code:

```
the_iterator = iter(something_iterable)
try:
    while True:
        elem = next(the_iterator)
        # do something
except StopIteration:
    pass
```

That is, it first calls the built-in `iter` function to create an *iterator*,
saving it in some new, hidden variable (we've called it `the_iterator` here).
It then repeatedly calls the built-in `next` function on this iterator to get
values of `elem` and stops when that function raises `StopIteration`.
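
To see the protocol from the other side, here is a minimal sketch (not part of the lab's starter files) of a class that implements `__iter__` and `__next__` by hand, so its instances work directly in `for` loops:

```python
class Squares:
    """An iterator over the squares 1, 4, 9, ..., limit**2."""
    def __init__(self, limit):
        self.current = 1
        self.limit = limit

    def __iter__(self):
        # Iterators return themselves, which is what lets a for loop
        # call iter() on one harmlessly.
        return self

    def __next__(self):
        if self.current > self.limit:
            raise StopIteration  # tells the for loop to stop
        result = self.current ** 2
        self.current += 1
        return result

print(list(Squares(4)))  # prints [1, 4, 9, 16]
```

Since `list` and `for` both use `iter` and `next` under the hood, a `Squares` instance can be consumed by either, but only once per instance.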

### Question 1: WWPD: Iterators

What would Python display? Try to figure it out before you type it into the interpreter!

Use OK to test your knowledge with the following "What Would Python Display?" questions:

`python3 ok -q iterators -u`

```
>>> s = [1, [2, [3, [4]]]]
>>> t = iter(s)
>>> next(t)
1
>>> next(iter(next(t)))
2
>>> list(t)
[]
>>> next(map(lambda x: x * x, filter(lambda y: y > 4, range(10))))
25
>>> tuple(map(abs, reversed(range(-6, -4))))
(5, 6)
>>> r = reversed(range(10000))
>>> next(r) - next(r)
1
>>> xs = [2, 3, 4, 5]
>>> y, z = iter(xs), iter(xs)
>>> next(zip(y, z))
(2, 2)
>>> next(zip(y, y))
(3, 4)
```

## Generators

A generator function returns a special type of **iterator** called a
**generator** object. Generator functions have `yield` statements within the
body of the function. Calling a generator function makes it return a generator
object rather than executing the body of the function.

The reason we say a generator object is a special type of **iterator** is that
it has all the properties of an iterator, meaning that:

- Calling the `__iter__` method makes a generator object return itself without
  modifying its current state.
- Calling the `__next__` method makes a generator object compute and return the
  next object in its sequence. If the sequence is exhausted, `StopIteration` is
  raised.
- Typically, a generator should not restart unless it's defined that way. But
  calling the generator function again returns a brand new generator object
  (like calling `__iter__` on an iterable object).

However, they do have some fundamental differences:

- An iterator is a class with `__next__` and `__iter__` explicitly defined, but
  a generator can be written as a mere function with a `yield` in it.
- `__next__` in an iterator uses `return`, but a generator uses `yield`.
- A generator "remembers" its state for the next `__next__` call. Therefore,
  the first `__next__` call works like this:
    1. Enter the function, run until the line with `yield`.
    2. Return the value in the `yield` statement, but remember the state of
       the function for future `__next__` calls.

  And subsequent `__next__` calls work like this:
    1. Re-enter the function, start at **the line after** the `yield`, and run
       until the next `yield` statement.
    2. Return the value in the `yield` statement, but remember the state of
       the function for future `__next__` calls.

When a generator runs to the end of the function, it raises `StopIteration`.

Another useful tool for generators is the `yield from` statement (introduced in
Python 3.3). `yield from` will yield all values from an iterator or iterable.
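
For instance (a quick sketch, not part of the lab files), a generator can delegate to other iterables with `yield from`:

```python
def down_then_up(n):
    # Each yield from forwards every value of its iterable, one at a time,
    # exactly as if we had looped over it and yielded each element.
    yield from range(n, 0, -1)
    yield from range(1, n + 1)

print(list(down_then_up(3)))  # prints [3, 2, 1, 1, 2, 3]
```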

### Question 2: WWPD: Generators

Use OK to test your knowledge with the following "What Would Python Display?" questions:

`python3 ok -q generators -u`

```
def generator():
    print("Starting here")
    i = 0
    while i < 6:
        print("Before yield")
        yield i
        print("After yield")
        i += 1
```

```
>>> g = generator()
>>> g # what type of object is this?
<generator object ...>
>>> g == iter(g) # equivalent of g.__iter__()
True
>>> next(g) # equivalent of g.__next__()
Starting here
Before yield
0
>>> next(g)
After yield
Before yield
1
>>> next(g)
After yield
Before yield
2
```

### Question 3: Countdown

Write a generator function that counts down to 0.

```
def countdown(n):
    """
    A generator that counts down from N to 0.
    >>> for number in countdown(5):
    ...     print(number)
    ...
    5
    4
    3
    2
    1
    0
    >>> for number in countdown(2):
    ...     print(number)
    ...
    2
    1
    0
    """
    "*** YOUR CODE HERE ***"
    while n >= 0:
        yield n
        n = n - 1
```

Use OK to test your code:

`python3 ok -q countdown`

### Question 4: Trap

Write a generator function that yields the first `k` values in iterable `s`,
but raises a `ValueError` exception if any more values are requested.

```
def trap(s, k):
    """Return a generator that yields the first K values in iterable S,
    but raises a ValueError exception if any more values are requested.
    >>> t = trap([3, 2, 1], 2)
    >>> next(t)
    3
    >>> next(t)
    2
    >>> next(t)
    ValueError
    >>> list(trap(range(5), 5))
    ValueError
    """
    assert len(s) >= k
    "*** YOUR CODE HERE ***"
    t = iter(s)
    while k > 0:
        yield next(t)
        k -= 1
    raise ValueError("It's a trap!")
```

Use OK to test your code:

`python3 ok -q trap`

### Question 5: Repeated

Implement a function (not a generator function) that returns the first value in
iterable `t` that appears `k` times in a row.

```
def repeated(t, k):
    """Return the first value in iterable T that appears K times in a row.
    >>> s = [3, 2, 1, 2, 1, 4, 4, 5, 5, 5]
    >>> repeated(trap(s, 7), 2)
    4
    >>> repeated(trap(s, 10), 3)
    5
    >>> print(repeated([4, None, None, None], 3))
    None
    """
    assert k > 1
    "*** YOUR CODE HERE ***"
    count = 0
    last_item = None
    for item in t:
        if item == last_item:
            count += 1
        else:
            last_item = item
            count = 1
        if count == k:
            return item
```

Use OK to test your code:

`python3 ok -q repeated`

## Streams

The `Stream` class defines a lazy sequence, a lazily evaluated linked list. In
other words, a Stream's elements (except for the first element) are only
evaluated as those values are needed.

```
class Stream:
    empty = 'empty'

    def __init__(self, first, compute_rest=lambda: Stream.empty):
        self.first = first
        self.cached_rest = None
        assert callable(compute_rest)
        self.compute_rest = compute_rest

    @property
    def rest(self):
        """Return the rest, computing it if necessary."""
        if self.compute_rest is not None:
            self.cached_rest = self.compute_rest()
            self.compute_rest = None
        return self.cached_rest

    def __repr__(self):
        rest = self.cached_rest if self.compute_rest is None else '<...>'
        return 'Stream({}, {})'.format(self.first, rest)
```

Instead of specifying all of the elements in `__init__`, we provide a function,
`compute_rest`, that encapsulates the algorithm used to calculate the remaining
elements of the stream. The code in the function body is not evaluated until it
is called, which lets us implement the desired evaluation behavior.

This implementation of streams also uses *memoization*. The first time a
program asks a `Stream` for its `rest` field, the `Stream` code computes the
required value using `compute_rest`, saves the resulting value, and then
returns it. After that, every time the `rest` field is referenced, the stored
value is simply returned and it is not computed again.

Here is an example (which you may use in solutions):

```
def make_integer_stream(first=1):
    def compute_rest():
        return make_integer_stream(first + 1)
    return Stream(first, compute_rest)
```

We start out with a stream whose first element is 1, and whose `compute_rest`
function creates another stream. So when we do compute the `rest`, we get
another stream whose first element is one greater than the previous element,
and whose `compute_rest` creates another stream. Hence, we effectively get an
infinite stream of integers, computed one at a time. This is almost like an
infinite recursion, but one which can be viewed one step at a time, and so does
not crash.

Another example:

```
def map_stream(fn, s):
    if s is Stream.empty:
        return s
    return Stream(fn(s.first), lambda: map_stream(fn, s.rest))
```
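
Putting the pieces together: the sketch below restates the `Stream` class from above (minus `__repr__`) so that it runs standalone, then peeks at a mapped integer stream. Each `.rest` access triggers at most one `compute_rest` call.

```python
class Stream:
    """Lazily evaluated linked list (the same class defined above)."""
    empty = 'empty'

    def __init__(self, first, compute_rest=lambda: Stream.empty):
        self.first = first
        self.cached_rest = None
        assert callable(compute_rest)
        self.compute_rest = compute_rest

    @property
    def rest(self):
        # Memoization: compute_rest runs at most once, then is discarded.
        if self.compute_rest is not None:
            self.cached_rest = self.compute_rest()
            self.compute_rest = None
        return self.cached_rest

def make_integer_stream(first=1):
    def compute_rest():
        return make_integer_stream(first + 1)
    return Stream(first, compute_rest)

def map_stream(fn, s):
    if s is Stream.empty:
        return s
    return Stream(fn(s.first), lambda: map_stream(fn, s.rest))

squares = map_stream(lambda x: x * x, make_integer_stream(1))
print(squares.first, squares.rest.first, squares.rest.rest.first)  # prints 1 4 9
```

Note that `map_stream` over an infinite stream terminates immediately: it only wraps the first element and defers the rest behind another `compute_rest`.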

### Question 6: Ones

Define a stream `ones` that is an infinite stream of ones.

```
ones = None
ones = Stream(1, lambda: ones)

def ones_test():
    """
    >>> ones.first, ones.rest.first, ones.rest.rest.first, ones.rest.rest.rest.first
    (1, 1, 1, 1)
    """
```

Use OK to test your code:

`python3 ok -q ones_test`

### Question 7: Scan

Define a function `scan`, which takes in a two-argument function `f`, an
initial value `a0`, and an infinite stream with elements `a1, a2, a3, ...`,
and outputs the stream

`a0, f(a0, a1), f(f(a0, a1), a2), f(f(f(a0, a1), a2), a3), ...`

```
def scan(f, initial_value, stream):
    """
    >>> ones = Stream(1, lambda: ones)
    >>> naturals = scan(lambda x, y: x + y, 1, ones)
    >>> _ = naturals.rest.rest.rest
    >>> naturals
    Stream(1, Stream(2, Stream(3, Stream(4, <...>))))
    >>> factorials = scan(lambda x, y: x * y, 1, naturals)
    >>> _ = factorials.rest.rest.rest.rest
    >>> factorials
    Stream(1, Stream(1, Stream(2, Stream(6, Stream(24, <...>)))))
    """
    "*** YOUR CODE HERE ***"
    return Stream(initial_value,
                  lambda: scan(f, f(initial_value, stream.first), stream.rest))
```

Use OK to test your code:

`python3 ok -q scan`

## Extra Questions

### Question 8: Scale

Implement the generator function `scale(s, k)`, which yields elements of the
given iterable `s`, scaled by `k`.

```
def scale(s, k):
    """Yield elements of the iterable s scaled by a number k.
    >>> s = scale([1, 5, 2], 5)
    >>> type(s)
    <class 'generator'>
    >>> list(s)
    [5, 25, 10]
    >>> m = scale(naturals(), 2)
    >>> [next(m) for _ in range(5)]
    [2, 4, 6, 8, 10]
    """
    "*** YOUR CODE HERE ***"
    for elem in s:
        yield elem * k
```

Use OK to test your code:

`python3 ok -q scale`

### Question 9: Remainder Generator

Like functions, generators can also be *higher-order*. For this problem, we
will be writing `remainders_generator`, which yields a series of generator
objects.

`remainders_generator` takes in an integer `m`, and yields `m` different
generators. The first generator is a generator of multiples of `m`, i.e.
numbers where the remainder is 0. The second, a generator of natural numbers
with remainder 1 when divided by `m`. The last generator yields natural
numbers with remainder `m - 1` when divided by `m`.

```
def remainders_generator(m):
    """
    Takes in an integer m, and yields m different remainder groups
    of m.
    >>> remainders_mod_four = remainders_generator(4)
    >>> for rem_group in remainders_mod_four:
    ...     for _ in range(3):
    ...         print(next(rem_group))
    0
    4
    8
    1
    5
    9
    2
    6
    10
    3
    7
    11
    """
    "*** YOUR CODE HERE ***"
    def remainder_group(rem):
        start = rem
        while True:
            yield start
            start += m
    for rem in range(m):
        yield remainder_group(rem)
```

Note that if you have implemented this correctly, each of the generators
yielded by `remainders_generator` will be *infinite* -- you can keep calling
`next` on them forever without running into a `StopIteration` exception.

**Hint**: Consider defining an inner generator function. What arguments
should it take in? Where should you call it?

Use OK to test your code:

`python3 ok -q remainders_generator`

### Question 10: Zip generator

For this problem, we will be writing `zip`, a generator function that yields a
series of lists, each containing the nth items of each iterable passed to it.
It should stop when the smallest iterable runs out of elements.

```
def zip(*iterables):
    """
    Takes in any number of iterables and zips them together.
    Returns a generator that outputs a series of lists, each
    containing the nth items of each iterable.
    >>> z = zip([1, 2, 3], [4, 5, 6], [7, 8])
    >>> for i in z:
    ...     print(i)
    ...
    [1, 4, 7]
    [2, 5, 8]
    """
    "*** YOUR CODE HERE ***"
    iterators = [iter(iterable) for iterable in iterables]
    while True:
        try:
            yield [next(iterator) for iterator in iterators]
        except StopIteration:
            # In Python 3.7+, a StopIteration escaping a generator body
            # becomes a RuntimeError (PEP 479), so return explicitly when
            # any iterator runs out.
            return
```

Use OK to test your code:

`python3 ok -q zip`

### Question 11: Convolve Streams

Define a function `convolve_streams`, which takes in two streams `a` and `b`,
each of which represents a polynomial (the nth term represents the coefficient
of `x^n` in the polynomial). For example, if we have the streams

`2, 3, 1, 0, 0, 0, 0...`

and

`6, -1, 0, 0, 0, 0...`

then `convolve_streams` should output the stream

`12, 16, 3, -1, 0, 0, 0...`

Hint: note that `(a0 + a1 * x + a2 * x^2...)b(x) = a0 b(x) + (a1 * x + a2 * x^2...)b(x)`

```
def convolve_streams(a, b):
    """
    >>> zeros = Stream(0, lambda: zeros)
    >>> a = Stream(2, lambda: Stream(3, lambda: Stream(1, lambda: zeros)))
    >>> b = Stream(6, lambda: Stream(-1, lambda: zeros))
    >>> c = convolve_streams(a, b)
    >>> _ = c.rest.rest.rest.rest
    >>> c
    Stream(12, Stream(16, Stream(3, Stream(-1, Stream(0, <...>)))))
    """
    "*** YOUR CODE HERE ***"
    def add_streams(u, v):
        return Stream(u.first + v.first, lambda: add_streams(u.rest, v.rest))

    def scale_stream(k, u):
        return Stream(k * u.first, lambda: scale_stream(k, u.rest))

    return add_streams(scale_stream(a.first, b),
                       Stream(0, lambda: convolve_streams(a.rest, b)))
```

Use OK to test your code:

`python3 ok -q convolve_streams`