back to the cs61b class homepage
Simon's office hours and contact info
Simon's section
- Meets Tuesday and Thursday, 1pm (sharp, not Berkeley time) to 2pm,
in Soda 320
- Section 1 - Tues 6/27: welcome, icebreaker
- Collatz.java: A simple program with some looping.
Takes an integer argument on the command line.
- Dressup.java: A use of arrays of
String objects and nested for loops.
- Point.java: A little class that
implements Cartesian points and illustrates the fact that Java
objects are references.
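The aliasing point is worth a tiny demo. Here's a minimal sketch in the spirit of Point.java (the field names and the demo are my guesses, not the posted file):

```java
// Hypothetical reconstruction, not the section's Point.java.
public class Point {
    public double x, y;

    public Point(double x, double y) {
        this.x = x;
        this.y = y;
    }

    public static void main(String[] args) {
        Point p = new Point(1.0, 2.0);
        Point q = p;       // q is another reference to the SAME object,
        q.x = 99.0;        // so mutating through q ...
        System.out.println(p.x);  // ... is visible through p: prints 99.0
    }
}
```

If you wanted a separate point, you'd have to say `new Point(p.x, p.y)` yourself; assignment never copies the object.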
- Section 2 - Thurs 6/29: learning by experiment, object
orientation and invariants
- stuff00.java,
morestuff.java: This code was slapped
together during section to help answer a few
questions about static (class) variables, "this", and inner
classes. It doesn't do anything important and the names are
particularly ugly; it's included here in case anyone wants to
experiment with it further. "stuff00" should compile; "morestuff" does
not, for the reasons mentioned in it. The thing to take
from this code is that if you have questions about a language,
you can often answer them by setting up and running an
experiment like these.
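For example, an experiment in the same spirit (this one is mine, not stuff00.java) that answers "is a static variable shared across instances?" by just running it:

```java
// A throwaway experiment: static (class) variables are shared by all
// instances; instance variables are not; "this" names the receiver.
public class StaticExperiment {
    static int shared = 0;   // one copy for the whole class
    int own = 0;             // one copy per instance

    void bump() {
        shared += 1;
        this.own += 1;       // "this" is the instance bump() was called on
    }

    public static void main(String[] args) {
        StaticExperiment a = new StaticExperiment();
        StaticExperiment b = new StaticExperiment();
        a.bump();
        b.bump();
        System.out.println(StaticExperiment.shared); // 2: both bumps hit it
        System.out.println(a.own);                   // 1: only a's bump
    }
}
```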
- No section on Tuesday 7/4
- Section 3 - Thurs 7/6: Inheritance, exceptions, cryptograms and Caesar ciphers
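A Caesar cipher just shifts each letter a fixed distance around the alphabet. A sketch of the idea (my code, not what we wrote in section):

```java
public class Caesar {
    // Shift each letter by `key` positions (key >= 0), wrapping around
    // the alphabet; characters that aren't letters pass through unchanged.
    public static String encrypt(String s, int key) {
        StringBuilder out = new StringBuilder();
        for (char c : s.toCharArray()) {
            if (Character.isUpperCase(c)) {
                out.append((char) ('A' + (c - 'A' + key) % 26));
            } else if (Character.isLowerCase(c)) {
                out.append((char) ('a' + (c - 'a' + key) % 26));
            } else {
                out.append(c);
            }
        }
        return out.toString();
    }

    // Decryption is just encryption with the complementary shift.
    public static String decrypt(String s, int key) {
        return encrypt(s, 26 - (key % 26));
    }
}
```

This is why Caesar ciphers fall to cryptogram-style frequency analysis: there are only 26 keys, and letter frequencies survive the shift.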
- Section 4 - Tues 7/11: Go over exam 1.
- Section 5 - Thurs 7/13: Lists, trees, two-dimensional
arrays (TwoD.java).
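The key fact about 2D arrays in Java is that they are arrays of arrays. Something in the spirit of TwoD.java (this sketch is mine, not the posted file):

```java
public class TwoD {
    // Build a rows-by-cols grid filled with row-major indices 0, 1, 2, ...
    public static int[][] build(int rows, int cols) {
        // A 2D array is an array whose elements are themselves arrays.
        int[][] grid = new int[rows][cols];
        for (int r = 0; r < grid.length; r += 1) {
            for (int c = 0; c < grid[r].length; c += 1) {
                grid[r][c] = r * cols + c;
            }
        }
        return grid;
    }

    public static void main(String[] args) {
        int[][] grid = build(3, 4);
        System.out.println(grid[2][3]);  // prints 11

        // Because each row is a separate array object, rows may even
        // have different lengths ("ragged" arrays are legal):
        int[][] ragged = new int[2][];
        ragged[0] = new int[1];
        ragged[1] = new int[5];
    }
}
```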
- Section 6 - Tues 7/18: Complexity of bubble sort.
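As a reminder of where the quadratic bound comes from, here's a bubble sort sketch (my code, not the section's): each sweep does up to n - 1 comparisons and there are up to n - 1 sweeps, so the worst case is theta(n^2).

```java
public class BubbleSort {
    // Repeatedly sweep the array, swapping adjacent out-of-order pairs.
    // Each sweep "bubbles" the largest remaining element to the end.
    // Worst case theta(n^2); the `swapped` flag makes an already-sorted
    // input finish in a single theta(n) sweep.
    public static void sort(int[] a) {
        boolean swapped = true;
        for (int end = a.length - 1; swapped && end > 0; end -= 1) {
            swapped = false;
            for (int i = 0; i < end; i += 1) {
                if (a[i] > a[i + 1]) {
                    int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                    swapped = true;
                }
            }
        }
    }
}
```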
- Section 7 - Thurs 7/20: HW3. More on big-O, big-Theta, big-Omega. Quickselect.
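For reference, a quickselect sketch using a random Lomuto partition (this is my illustration, not HW3's code): it finds the kth smallest element in expected theta(n) time, versus theta(n log n) for sorting first.

```java
import java.util.Random;

public class Quickselect {
    private static final Random RNG = new Random();

    // Returns the kth smallest element of a (k is 0-indexed),
    // in expected linear time. Reorders a as a side effect.
    public static int select(int[] a, int k) {
        int lo = 0, hi = a.length - 1;
        while (lo < hi) {
            int p = partition(a, lo, hi);  // pivot lands at its final index
            if (p == k) return a[p];
            if (p < k) lo = p + 1;         // kth smallest is to the right
            else       hi = p - 1;         // kth smallest is to the left
        }
        return a[lo];
    }

    // Lomuto partition around a randomly chosen pivot.
    private static int partition(int[] a, int lo, int hi) {
        swap(a, lo + RNG.nextInt(hi - lo + 1), hi);  // random pivot to end
        int pivot = a[hi], i = lo;
        for (int j = lo; j < hi; j += 1) {
            if (a[j] < pivot) swap(a, i++, j);
        }
        swap(a, i, hi);
        return i;
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}
```

Unlike quicksort, quickselect recurses into only one side of the partition, which is what drops the expected cost from n log n to n.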
- Section 8 - Tues 7/25: HW4. Heaps.
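For reference, a minimal array-backed binary min-heap (a sketch of the standard technique, not HW4's code): the parent of index i lives at (i - 1) / 2, so insert bubbles up and removeMin bubbles down, each in O(log n).

```java
import java.util.ArrayList;

public class MinHeap {
    // Array-backed binary min-heap: children of i are 2i+1 and 2i+2.
    private final ArrayList<Integer> items = new ArrayList<>();

    public void insert(int x) {
        items.add(x);
        int i = items.size() - 1;
        while (i > 0 && items.get(i) < items.get((i - 1) / 2)) {
            swap(i, (i - 1) / 2);      // bubble up past larger parents
            i = (i - 1) / 2;
        }
    }

    public int removeMin() {
        int min = items.get(0);
        int last = items.remove(items.size() - 1);
        if (!items.isEmpty()) {
            items.set(0, last);        // move last item to the root,
            int i = 0;                 // then bubble it down
            while (true) {
                int l = 2 * i + 1, r = 2 * i + 2, smallest = i;
                if (l < items.size() && items.get(l) < items.get(smallest)) smallest = l;
                if (r < items.size() && items.get(r) < items.get(smallest)) smallest = r;
                if (smallest == i) break;
                swap(i, smallest);
                i = smallest;
            }
        }
        return min;
    }

    private void swap(int i, int j) {
        int t = items.get(i);
        items.set(i, items.get(j));
        items.set(j, t);
    }
}
```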
- Tues 8/8: Graphs and a little bit on amortized
analysis. The way to prove that insert into a doubling list takes
amortized constant (theta(1)) time
is to show that k inserts take theta(k) time. The first k inserts,
through the first doubling, take about 2k time (k for the inserts, k
for the doubling). Inductively, the k inserts you do from just after a
doubling through the next doubling take theta(k) time (k for the
inserts, 2k for the doubling). Thus, insert into a doubling list
takes amortized constant time.
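A doubling list in code looks something like this (a sketch of the standard technique, not any particular handout):

```java
public class DoublingList {
    private int[] items = new int[1];
    private int size = 0;

    // Amortized theta(1): most inserts are a single store, and a resize
    // that copies size elements only happens after size cheap inserts
    // have "paid" for it.
    public void insert(int x) {
        if (size == items.length) {
            int[] bigger = new int[items.length * 2];  // double the capacity
            System.arraycopy(items, 0, bigger, 0, size);
            items = bigger;
        }
        items[size] = x;
        size += 1;
    }

    public int get(int i) { return items[i]; }
    public int size()     { return size; }
}
```

Note that growing by a constant amount instead of doubling would break the argument: k inserts would then cost theta(k^2) total, i.e. theta(k) amortized per insert.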
- Thurs 8/10: Mostly Floyd-Warshall, some Bellman-Ford.
A few points about the Bellman-Ford shortest path algorithm:
- |V| - 1 iterations of the outer loop suffice (I had |V|
iterations on the board; that works but wastes time)
- If there are no changes to best paths in an iteration, you can
stop and you're done. This little trick improves the best-case
complexity of the algorithm to theta(|E|), but the worst case remains
theta(|V| * |E|)
- I said something that was incorrect. The first iteration of the
outer loop can find paths of arbitrary length (not just one edge
long). It all depends on the order in which you consider the edges.
The reason you need |V| - 1 iterations is that if your edge ordering is
unfortunate, you may need that many iterations to find the best path
from the origin to a node.
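Both of those points show up directly in code. A Bellman-Ford sketch with the |V| - 1 bound and the early-exit trick (my illustration; the edge-list representation here is just one convenient choice):

```java
import java.util.Arrays;

public class BellmanFord {
    // Edges are {from, to, weight} triples over vertices 0..n-1.
    // Returns shortest-path distances from source (assumes no
    // negative cycle reachable from source).
    public static long[] shortestPaths(int n, int[][] edges, int source) {
        long[] dist = new long[n];
        Arrays.fill(dist, Long.MAX_VALUE / 2);  // "infinity" safe from overflow
        dist[source] = 0;

        // |V| - 1 iterations of the outer loop suffice.
        for (int i = 0; i < n - 1; i += 1) {
            boolean changed = false;
            for (int[] e : edges) {             // relax every edge
                if (dist[e[0]] + e[2] < dist[e[1]]) {
                    dist[e[1]] = dist[e[0]] + e[2];
                    changed = true;
                }
            }
            if (!changed) break;  // early exit: best case theta(|E|)
        }
        return dist;
    }
}
```

With a lucky edge ordering the first pass already finds long paths and the `changed` flag ends the loop early; with an unlucky ordering each pass may extend best paths by only one edge, which is why |V| - 1 passes are needed in the worst case.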
Simon's lab