Information on course projects: EECS 281A / STAT 241A


The course project provides you with an opportunity to explore more deeply a particular aspect of the course that interests you. Of course, the topic of the project should be related to the material (concepts, models, algorithms, applications etc.) discussed in the course, but otherwise you have a fair bit of freedom in selecting a topic.
Possible types of course projects include:
  • a survey project: an overview of several research papers around a coherent theme.

  • an applications research project: demonstrating the application of some techniques from the course in an application of interest (e.g., vision, natural language processing, signal processing, coding, bioinformatics, artificial intelligence, communication, control, neuroscience etc.)

  • a theoretical or methodological research project: examples include studying different classes of models, proving convergence guarantees for a known algorithm, or developing a new method.

Naturally, the expectations are somewhat lower for a research project than for a survey project (since research is a capricious thing, and not always easy to predict!). However, in planning your project, you should ensure that it has several stages, and include at least some goals that you expect to be able to meet within the semester. You are free to work either individually or in groups of 2 to 3 people total.


    The course project is worth roughly 40% of your grade; evaluation will be based upon:
  • A technical write-up describing your project area and results. This write-up should be aimed at a technically knowledgeable audience, but should not assume expertise in your particular area. (It should be readable, for instance, by one of your classmates.) It will be useful to include sections giving the background of your problem, past work (if relevant), and a conclusion in which you state what future directions you might take this work in. You should aim for a write-up of roughly 10--15 pages. Note that length per se is not the main concern; rather, the main concern is the clarity of the presentation.

    Sample projects from previous semesters

    Here are a few examples of projects from previous years. They are not necessarily the best ones; they are just a pseudorandom sample to give you an idea of what people can do.
    • Graphical Models for Game Theory: A survey of some recent papers, by Ambuj Tewari - report [ps] - poster [ppt] [jpeg]
    • A Naive Bayes Spam Filter, by Kai Wei - report - poster [jpeg]
    • Localization of Robots Using Particle Filters, by Phoebus Chen - poster [jpeg]
    • Nonlinear Dimensionality Reduction on Human Facial Expressions, by Ryan White - report [pdf] - poster [jpeg]
    • Combining SVM with graphical models for supervised classification: an introduction to Max-Margin Markov Networks, by Simon Lacoste-Julien - report - poster [jpeg]
    List of projects for 2002-2003

    Background material

    This section contains a number of suggestions for background reading that could be useful in generating project ideas.

  • Many of these papers are semi-tutorial in nature, and are useful to read to gain a general sense of the field. If you are interested in a more specific topic, then it is worthwhile to follow up on the references in a given paper.

  • This section is a (rapidly) evolving beast. Feel free to contribute suggestions or other relevant papers that you found interesting, or think other classmates might find useful.

  • Any of the papers listed here that appear in IEEE journals can be downloaded from the IEEExplore website, accessible for free from any computer on a Berkeley domain. See the proxy instructions if you want to connect to this service from a computer at home.

  • Any of the papers listed here that appear in most statistics journals (e.g., JASA, Annals of Statistics etc.) can be downloaded from JSTOR from a computer on the Berkeley domain.

    General (graphical models; message-passing)

  • M. I. Jordan and Y. Weiss. Graphical models: probabilistic inference. In The Handbook of Brain Theory and Neural Networks, 2002.

  • M. I. Jordan. Graphical models. General survey paper on graphical models, their applications and algorithms. Appeared in Statistical Science, 2004.

  • M. J. Wainwright and M. I. Jordan. Graphical models, exponential families, and variational methods. Semi-tutorial paper; appeared as a book chapter in New Directions in Statistical Signal Processing, 2003, 2005.

  • J. S. Yedidia, W. T. Freeman and Y. Weiss (2005). Constructing free energy approximations and generalized belief propagation algorithms. Appeared in IEEE Transactions on Information Theory, Vol. 51, pp. 2282--2312.
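As a concrete illustration of the message-passing ideas surveyed in the papers above, here is a minimal sketch of the sum-product algorithm on a chain-structured model. All potentials are made-up toy values chosen only for illustration; the brute-force check at the end confirms that the message-passing marginals agree with direct enumeration.

```python
import numpy as np

# Toy chain x1 - x2 - x3, each variable binary.
# psi[i] is the pairwise potential between node i and node i+1.
psi = [np.array([[1.0, 0.5], [0.5, 2.0]]),
       np.array([[2.0, 1.0], [1.0, 1.5]])]

# Forward messages m_f[i]: message arriving at node i from the left.
m_f = [np.ones(2)]
for p in psi:
    m = m_f[-1] @ p          # sum out the variable to the left
    m_f.append(m / m.sum())  # normalize for numerical stability

# Backward messages m_b[i]: message arriving at node i from the right.
m_b = [np.ones(2)]
for p in reversed(psi):
    m = p @ m_b[-1]          # sum out the variable to the right
    m_b.append(m / m.sum())
m_b.reverse()

# Each node marginal is proportional to the product of its incoming messages.
marginals = []
for f, b in zip(m_f, m_b):
    q = f * b
    marginals.append(q / q.sum())

# Brute-force check over all 2^3 configurations of the chain.
joint = np.einsum('ab,bc->abc', psi[0], psi[1])
joint /= joint.sum()
brute = [joint.sum(axis=(1, 2)), joint.sum(axis=(0, 2)), joint.sum(axis=(0, 1))]
for m, b in zip(marginals, brute):
    assert np.allclose(m, b)
```

On a tree (of which a chain is the simplest case), sum-product computes all node marginals exactly in two passes; the Yedidia, Freeman and Weiss paper above discusses what happens when the same updates are run on graphs with cycles.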

    Signal processing

  • A. S. Willsky (2002). Multiresolution Markov models for signal and image processing. Appeared in Proceedings of the IEEE Vol. 90, pp. 1396--1458.

  • H.-A. Loeliger (2004). An Introduction to Factor Graphs. Appeared in IEEE Signal Processing Magazine, Vol. 21, pp. 28--41.

    Communication and coding

  • F. R. Kschischang, B. J. Frey and H.-A. Loeliger (2001). Factor graphs and the sum-product algorithm. Appeared in IEEE Transactions on Information Theory, Vol. 47, pp. 498--519.

  • S. M. Aji and R. J. McEliece (2000). The Generalized Distributive Law. Appeared in IEEE Transactions on Information Theory, Vol. 46, pp. 325--343.

    Markov chain Monte Carlo

  • S. Geman and D. Geman (1984). Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. Appeared in IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 6, pp. 721--741.

  • J. Besag, P. Green, D. Higdon and K. Mengersen (1995). Bayesian computation and stochastic systems. Appeared in Statistical Science, Vol. 10, pp. 3--41.
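The Geman and Geman paper above introduced the Gibbs sampler in the context of image restoration. As a minimal illustration, here is a Gibbs sampler for a small Ising model, repeatedly resampling each site from its conditional distribution given its neighbors. The grid size, coupling strength, and sweep count are made-up toy values, not recommendations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny Ising model on an n x n grid: p(x) proportional to
# exp(beta * sum of x_i * x_j over neighboring pairs), with x_i in {-1, +1}.
n, beta, n_sweeps = 8, 0.4, 200
x = rng.choice([-1, 1], size=(n, n))

def neighbor_sum(x, i, j):
    # Sum of the 4-neighborhood values, with free boundary conditions.
    s = 0
    if i > 0: s += x[i - 1, j]
    if i < n - 1: s += x[i + 1, j]
    if j > 0: s += x[i, j - 1]
    if j < n - 1: s += x[i, j + 1]
    return s

for _ in range(n_sweeps):
    for i in range(n):
        for j in range(n):
            # The conditional p(x_ij = +1 | rest) is a logistic function of
            # the neighborhood sum; resample the site from this conditional.
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * neighbor_sum(x, i, j)))
            x[i, j] = 1 if rng.random() < p_plus else -1

print(x.mean())  # average magnetization of the final sample
```

Each site update leaves the target distribution invariant, so sweeping over all sites yields a Markov chain whose stationary distribution is the Ising model; the Besag et al. paper above surveys this style of computation more broadly.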

    Natural language processing

  • S. Vogel, H. Ney, and C. Tillmann. HMM-based word alignment in statistical translation. In Proceedings of the 16th conference on Computational linguistics, pp. 836-841, Morristown, NJ, USA, 1996. Association for Computational Linguistics.


  • M.I. Jordan. Chapter 23 of book in preparation. (See the course reading list.)

    Computer vision; image processing

  • S. Geman and D. Geman (1984). Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. Appeared in IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 6, pp. 721--741.

  • W. T. Freeman, E. C. Pasztor and O. T. Carmichael (2000). Learning Low-Level Vision. Appeared in International Journal of Computer Vision , Vol. 40, pp. 25--47.

  • K. Murphy, A. Torralba, and W. T. Freeman. Using the forest to see the trees: a graphical model relating features, objects, and scenes. In Advances in Neural Information Processing Systems 16 (NIPS), Vancouver, BC, MIT Press, 2004.
    Neuroscience

    Last modified: 09/01/2008.