Fall 2010: STAT 598L: Probabilistic Graphical Models

# Assignments

## Homework 1

Due: Friday, September 10, 2010, 9:30am
Reading: K&F Chapters 1, 2, 3.1-3.3.2, A.3
Written Exercises: 2.3, 2.5, 2.9, 2.10, 3.1, 3.2, 3.6, 3.7 (optional), 3.8 (optional), 3.9
Programming Exercises: For programming exercises, assume that the input is a binary $N\times N$ adjacency matrix for a graph with $N$ nodes.

1. Implement the topological sort algorithm (Algorithm A.1).
   1. Test cases: very small ($5\times5$ matrix and the topological order), small ($10\times10$ matrix), medium ($50\times50$ matrix), and large ($1000\times1000$ matrix)
2. 2.22: implement BFS to determine whether a given directed graph is cyclic. Do not try it on large graphs. Provide a verbal description of your algorithm.
   1. Test cases: very small ($5\times 5$ matrix; ignore the topological order), small ($10\times10$ matrix), smallish medium (dg_medium_loop.zip, $15\times15$ matrix), and medium ($50\times50$ matrix)
3. 2.23: implement an algorithm that takes a PDAG and returns the chain components and their ordering. Provide a verbal description of your algorithm and its complexity analysis. Your final output should be a list of components (indices), with components arranged in a topological order.
   1. Test cases: medium ($50\times50$ matrix and a list of chain components in a topological order)
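
Kahn's algorithm is one standard way to realize a topological sort on an adjacency matrix, and it detects cycles for free (exercise 2.22 asks for a BFS-based check, which this queue-driven traversal essentially is). A minimal Python sketch, not part of the handout, assuming `A` is a list of lists with `A[i][j] == 1` iff there is an edge $i \to j$:

```python
def topological_sort(A):
    """Kahn's algorithm on an N x N binary adjacency matrix.

    A[i][j] == 1 means there is a directed edge i -> j.
    Returns a list of node indices in topological order, or raises
    ValueError if the graph contains a directed cycle.
    """
    n = len(A)
    # In-degree of each node: number of incoming edges.
    indegree = [sum(A[i][j] for i in range(n)) for j in range(n)]
    frontier = [v for v in range(n) if indegree[v] == 0]  # current sources
    order = []
    while frontier:
        v = frontier.pop()
        order.append(v)
        for w in range(n):
            if A[v][w]:
                indegree[w] -= 1
                if indegree[w] == 0:
                    frontier.append(w)
    if len(order) < n:  # leftover nodes all lie on (or behind) a cycle
        raise ValueError("graph is cyclic")
    return order
```

On the chain $1 \to 2 \to 3$ (0-based indices) this returns `[0, 1, 2]`; on a two-node cycle it raises.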

Submission instructions and a sample graph and the resulting outputs will be posted here in the near future.
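
For exercise 2.23 above, one natural decomposition is: (1) find the connected components of the undirected part of the PDAG, then (2) topologically sort the components using the directed edges between them. A hedged Python sketch (the encoding of undirected edges as `A[i][j] == A[j][i] == 1` is an assumption, not part of the handout); both phases are $O(N^2)$ on the matrix representation:

```python
def chain_components(A):
    """Chain components of a PDAG, returned in topological order.

    A is an N x N adjacency matrix; an undirected edge i - j is assumed
    encoded as A[i][j] == A[j][i] == 1, a directed edge i -> j as
    A[i][j] == 1, A[j][i] == 0.  Returns a list of components, each a
    sorted list of node indices.
    """
    n = len(A)
    # Phase 1: connected components of the undirected part (DFS).
    comp = [-1] * n
    components = []
    for s in range(n):
        if comp[s] != -1:
            continue
        cid = len(components)
        comp[s] = cid
        stack, members = [s], []
        while stack:
            v = stack.pop()
            members.append(v)
            for w in range(n):
                if A[v][w] and A[w][v] and comp[w] == -1:
                    comp[w] = cid
                    stack.append(w)
        components.append(sorted(members))
    # Phase 2: topologically order components by directed edges between them.
    k = len(components)
    indeg = [0] * k
    succ = [set() for _ in range(k)]
    for i in range(n):
        for j in range(n):
            if A[i][j] and not A[j][i] and comp[i] != comp[j]:
                if comp[j] not in succ[comp[i]]:
                    succ[comp[i]].add(comp[j])
                    indeg[comp[j]] += 1
    frontier = [c for c in range(k) if indeg[c] == 0]
    order = []
    while frontier:
        c = frontier.pop()
        order.append(c)
        for d in succ[c]:
            indeg[d] -= 1
            if indeg[d] == 0:
                frontier.append(d)
    return [components[c] for c in order]
```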

## Homework 2

Due: Monday, October 4, 2010, 9:30am
Written Exercises: 4.1, 4.2, 4.3, 4.5, 4.10, 4.15, 5.1 (extra credit), 5.2 (extra credit), 5.7, 5.12, 7.2
Programming Exercises:

1. 3.27: Implement Algorithm 3.1
2. 3.28
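
Assuming Algorithm 3.1 refers to the reachable-via-active-trails procedure for d-separation, a Python sketch of exercise 3.27 is below (the two-phase structure, ancestors of the evidence first and then a BFS over (node, arrival-direction) states, is the standard formulation; the DAG encoding `A[i][j] == 1` iff $i \to j$ is an assumption):

```python
from collections import deque

def reachable(A, x, Z):
    """Nodes reachable from x via active trails given evidence set Z.

    A is an N x N adjacency matrix for a DAG (A[i][j] == 1 iff i -> j).
    Returns the set of nodes d-connected to x given Z (including x).
    """
    n = len(A)
    parents = [[i for i in range(n) if A[i][j]] for j in range(n)]
    children = [[j for j in range(n) if A[i][j]] for i in range(n)]
    # Phase 1: Z together with all of its ancestors.
    anc = set(Z)
    stack = list(Z)
    while stack:
        v = stack.pop()
        for p in parents[v]:
            if p not in anc:
                anc.add(p)
                stack.append(p)
    # Phase 2: BFS over (node, direction-of-arrival) states.
    UP, DOWN = 0, 1  # UP: arrived from a child; DOWN: arrived from a parent
    visited, result = set(), set()
    queue = deque([(x, UP)])
    while queue:
        y, d = queue.popleft()
        if (y, d) in visited:
            continue
        visited.add((y, d))
        if y not in Z:
            result.add(y)
        if d == UP and y not in Z:
            for p in parents[y]:        # trail continues upward
                queue.append((p, UP))
            for c in children[y]:       # trail turns downward
                queue.append((c, DOWN))
        elif d == DOWN:
            if y not in Z:
                for c in children[y]:   # trail continues downward
                    queue.append((c, DOWN))
            if y in anc:                # v-structure activated by evidence
                for p in parents[y]:
                    queue.append((p, UP))
    return result
```

On the v-structure $X_1 \to X_3 \leftarrow X_2$, $X_2$ is unreachable from $X_1$ given no evidence but becomes reachable once $X_3$ is observed.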

## Homework 3

Due: Monday, November 1, 2010, 9:30am
Reading: K&F Chapters: 8, 16, 17, 18, 20
Written Exercises: 7.7 (be sure to check the errata, especially if you have the first printing!)
Programming Exercises:

- Implement the Chow-Liu tree algorithm (paper) for structure and parameter estimation for fully observed categorical-valued data sets. (chowliu.zip contains a data set of 200 samples of a 10-dimensional binary variable, together with the estimated set of parameters and edges.)
- Implement iterative scaling for jointly Gaussian variables. Use pairs of variables with a non-zero entry in the concentration matrix as cliques. (gaussian_is.zip contains two files: gaussian.txt contains the baseline 10×10 covariance matrix and the corresponding adjacency matrix for the Markov network; gaussian_is_solution.txt contains the solution covariance and concentration matrices.)
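
The Chow-Liu structure step reduces to a maximum-weight spanning tree over pairwise empirical mutual information; a hedged Python sketch for the binary/categorical case (the `data` layout, one sample per row with values in $\{0,\dots,k-1\}$, is an assumption, and parameter estimation afterwards is just the empirical conditionals along the chosen edges):

```python
import math
from itertools import combinations

def chow_liu_edges(data, k=2):
    """Chow-Liu tree skeleton for fully observed categorical data.

    data: list of samples, each a list of d values in {0, ..., k-1}.
    Returns the d-1 tree edges of a maximum-weight spanning tree over
    pairwise empirical mutual information (Kruskal + union-find).
    """
    n, d = len(data), len(data[0])

    def mutual_info(i, j):
        # Empirical joint counts for (X_i, X_j) and its marginals.
        pij = {}
        for row in data:
            pij[(row[i], row[j])] = pij.get((row[i], row[j]), 0) + 1
        pi = [sum(c for (a, b), c in pij.items() if a == v) for v in range(k)]
        pj = [sum(c for (a, b), c in pij.items() if b == v) for v in range(k)]
        # I(X_i; X_j) = sum p(a,b) log( p(a,b) / (p(a) p(b)) ), in nats.
        return sum((c / n) * math.log(c * n / (pi[a] * pj[b]))
                   for (a, b), c in pij.items())

    # Candidate edges sorted by decreasing mutual information.
    edges = sorted(combinations(range(d), 2),
                   key=lambda e: mutual_info(*e), reverse=True)
    # Kruskal: greedily keep edges that do not close a cycle.
    parent = list(range(d))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    tree = []
    for i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree
```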

## Homework 4

Due Monday, November 22, 2010, 9:30am
Reading: K&F Chapters 9, 10, 11
Programming Exercises: The goal of the homework is to implement exact inference for a positive Markov random field. Assume that you are given positive factors for a positive Gibbs distribution.

1. Implement maximum cardinality variable elimination ordering (Algorithm 9.3 in the textbook).
2. Implement another heuristic for elimination ordering (your choice of heuristic).
3. Construct maximal cliques using maximum cardinality ordering.
4. Construct a clique tree using a maximum spanning tree (MST) algorithm (see page 375, use either Prim's algorithm in A.2 or Kruskal's).
5. Implement calibration using belief propagation (either Algorithm 10.2 or 10.3) to compute the univariate marginal probabilities for all variables.
6. Suppose some of the variables are given as evidence. Implement posterior probability estimation for the rest of univariate posterior marginals (Algorithm 10.3 can easily do that).
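
For step 1, maximum cardinality search repeatedly marks the unmarked node with the most marked neighbors; a minimal Python sketch (ties broken by lowest index, an arbitrary choice on my part; for variable elimination you would typically eliminate in the reverse of the returned visit order):

```python
def max_cardinality_ordering(A):
    """Maximum cardinality search on an undirected graph.

    A is a symmetric N x N adjacency matrix for the Markov network.
    Repeatedly marks the unmarked node with the most marked neighbors
    and returns the visit order.
    """
    n = len(A)
    marked = [False] * n
    score = [0] * n  # number of already-marked neighbors of each node
    order = []
    for _ in range(n):
        # Unmarked node with the maximum number of marked neighbors
        # (max returns the first such node, so ties go to lower indices).
        v = max((u for u in range(n) if not marked[u]),
                key=lambda u: score[u])
        marked[v] = True
        order.append(v)
        for w in range(n):
            if A[v][w] and not marked[w]:
                score[w] += 1
    return order
```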

Box 10.A and Algorithm 10.A.1 can come in handy in the implementation.
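
Step 4 above (the clique tree) is a maximum spanning tree where the weight of a candidate edge between two cliques is the size of their intersection, i.e. the sepset. A hedged Python sketch using Kruskal's algorithm (cliques represented as Python sets is an assumption of this sketch):

```python
def clique_tree(cliques):
    """Clique tree via a maximum-weight spanning tree.

    cliques: list of sets of variable indices.  The weight of an edge
    between two cliques is the size of their intersection (the sepset);
    Kruskal's algorithm keeps the heaviest edges that do not close a
    cycle.  Returns a list of (i, j, sepset) triples of clique indices.
    """
    m = len(cliques)
    # Candidate edges between every pair of cliques, heaviest first.
    candidates = sorted(
        ((i, j) for i in range(m) for j in range(i + 1, m)),
        key=lambda e: len(cliques[e[0]] & cliques[e[1]]), reverse=True)
    parent = list(range(m))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    tree = []
    for i, j in candidates:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j, cliques[i] & cliques[j]))
    return tree
```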

### Sample files

Each of the four archives contains a file with the factor specification and an output file with the marginal probabilities for single variables; the first three also contain intermediate results (the adjacency matrix for the Markov network, the maximum cardinality ordering, the cliques obtained from variable elimination along the maximum cardinality ordering, and the resulting clique tree). pgmExactInference.m contains a high-level script for a potential solution.

For the gibbs10 network, gibbs10.zip also contains a file with the output of running pgmExactInference.m on the supplied Gibbs factors:

pgmExactInference('gibbs10.txt', [3 2 1 nan nan nan nan nan nan nan]);

The file with factor specification has the following format:

<number of variables>
<cardinality for var 1> <cardinality for var 2> ... <cardinality for var d>
<number of factors>
<number of vars in factor 1>
<var 1 in factor 1> <var 2 in factor 1> ... <var d_1 in factor 1>
<value 1 for factor 1> <value 2 for factor 1> ... <value |factor 1| for factor 1>
...
<number of vars in factor F>
<var 1 in factor F> <var 2 in factor F> ... <var d_F in factor F>
<value 1 for factor F> <value 2 for factor F> ... <value |factor F| for factor F>

For example, file gibbs4.txt contains the following:

4
2 2 2 2
4
2
1 2
1 2 3 4
2
1 4
4 3 2 1
2
2 3
3 2 1 4
2
3 4
1 1 1 1

There are 4 variables (assume $X_1, X_2, X_3, X_4$), each takes on 2 values (assume the values are {0,1}). There are 4 factors:

| $X_1$ | $X_2$ | $\phi_{12}\left(x_1,x_2\right)$ |
|---|---|---|
| 0 | 0 | 1 |
| 0 | 1 | 2 |
| 1 | 0 | 3 |
| 1 | 1 | 4 |

| $X_1$ | $X_4$ | $\phi_{14}\left(x_1,x_4\right)$ |
|---|---|---|
| 0 | 0 | 4 |
| 0 | 1 | 3 |
| 1 | 0 | 2 |
| 1 | 1 | 1 |

| $X_2$ | $X_3$ | $\phi_{23}\left(x_2,x_3\right)$ |
|---|---|---|
| 0 | 0 | 3 |
| 0 | 1 | 2 |
| 1 | 0 | 1 |
| 1 | 1 | 4 |

| $X_3$ | $X_4$ | $\phi_{34}\left(x_3,x_4\right)$ |
|---|---|---|
| 0 | 0 | 1 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 1 |