# Fall 2008

[Syllabus] [Homework] [Announcements] [Projects]

Current and previous announcements:
• IMPORTANT: Sahand will not be holding office hours tomorrow (Wednesday 12/03/2008).
• Homework clarification. For HW 5, problem 1, we are designing an accept-reject algorithm with a target distribution proportional to f(x) = x(1-x) and a proposal distribution g(x) = 1, where x \in [0,1]. We need to find an M such that f(x) \leq M g(x) for all x \in [0,1]. Of course, infinitely many values of M satisfy this, so for part (b) we select M such that if the proposed value is 1/2, the probability of accepting it is 1/2.
• Please note that the Tuesday poster session time has changed to 1pm-3pm.
• You can sign up for the poster session at Poster Signups. It doesn't matter what number you sign up under, just what day. Thanks.
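For concreteness, the accept-reject scheme from the HW 5 clarification can be sketched as below. This is an illustrative sketch, not the official solution: the value M = 1/2 is worked out from the stated condition, since accepting x = 1/2 with probability f(1/2) / (M g(1/2)) = (1/4)/M = 1/2 forces M = 1/2.

```python
import random

def accept_reject(f, M, n_samples, rng=random.Random(0)):
    """Accept-reject sampling with the uniform proposal g(x) = 1 on [0, 1].

    Each proposed x is accepted with probability f(x) / (M * g(x)).
    """
    samples = []
    while len(samples) < n_samples:
        x = rng.random()       # propose x ~ g = Uniform(0, 1)
        u = rng.random()       # uniform draw for the accept test
        if u <= f(x) / M:      # g(x) = 1, so accept w.p. f(x) / M
            samples.append(x)
    return samples

# Target proportional to f(x) = x(1 - x); M = 1/2 comes from the
# "accept x = 1/2 with probability 1/2" condition (our reading of the
# announcement, not necessarily the graded answer).
f = lambda x: x * (1 - x)
samples = accept_reject(f, M=0.5, n_samples=10_000)

# Sanity check: the normalized target is Beta(2, 2), whose mean is 1/2.
print(f"sample mean = {sum(samples) / len(samples):.3f}")
```

Note that f(x) attains its maximum 1/4 at x = 1/2, so any M >= 1/4 gives a valid bound; the homework's extra condition simply pins down one choice.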

### People

Professor:
Martin Wainwright (wainwrig AT SYMBOL eecs DOT berkeley DOT edu)
Offices: 263 Cory Hall, 3-1978; 421 Evans Hall, 3-1975
Office hours: Tuesday, 11:00-12:00, 421 Evans Hall; Thursday, 15:30-16:30, starting after class in 3 LeConte. Afterwards, check 330 Evans (for larger groups) or 421 Evans (Prof. Wainwright's office).

TA:
Sahand Negahban (sahand_n AT SYMBOL eecs DOT berkeley DOT edu)
Office hours: Wednesday, 13:30-14:30 611 Soda Hall (Alcove)

TA:
Oleg Mayba (oleg AT SYMBOL stat DOT berkeley DOT edu)
Office hours: Mondays, 10:00-11:00 387 Evans Hall.

### Practical information

Course description: This 3-unit course provides an introduction to probabilistic models based on graphs. Graphical models provide a flexible and powerful framework for capturing statistical dependencies in complex, multivariate data. Key issues to be addressed include representation, efficient algorithms, inference, and statistical estimation. These concepts will be illustrated with examples drawn from various application domains, including machine learning, signal processing, communication theory, computational biology, and computer vision.

Outline:
• Basics of graphical models: Markov properties, recursive decomposability, elimination algorithms
• Sum-product algorithm, factor graphs, semi-rings
• Markov properties of graphical models
• Junction tree algorithm
• Chains, trees, factorial models, coupled models, layered models
• Kalman filtering and Rauch-Tung-Striebel smoothing
• Hidden Markov models (HMM) and forward-backward
• Exponential family, sufficiency, conjugacy
• Frequentist and Bayesian methods
• The EM algorithm
• Conditional mixture models, hierarchical mixture models
• Factor analysis, principal component analysis (PCA), canonical correlation analysis (CCA), independent component analysis (ICA)
• Importance sampling, Gibbs sampling, Metropolis-Hastings
• Variational algorithms: mean field, belief propagation, convex relaxations
• Dynamical graphical models
• Model selection, marginal likelihood, AIC, BIC and MDL
• Applications to signal processing, bioinformatics, communication, computer vision, etc.

Volume: 3 units

Lectures: 3 LeConte, Tues, Thurs 14:00-15:30.

Section: Wednesday, 17:00-18:00, 330 Evans Hall.