Tuesday, April 14, 2009

More on MCMC in Python

Markov-Chain Monte Carlo (MCMC) seems to be a promising technique for the calibration/imaging problem that we are facing with our experiment, the Precision Array for Probing the Epoch of Reionization (PAPER). Yesterday, in addition to taking a crash course in MCMC, I also started playing with PyMC, which implements, among other things, MCMC using Metropolis-Hastings chains. A first shot at a simple fitter using PyMC went something like this:

import pymc, numpy as n, pylab as p
from pymc import deterministic

# Generate noisy "measured" data from a known quadratic
x = n.arange(-10., 10, .01)

def actual_func(a, b, c): return a*x**2 + b*x + c

sig = .1
tau = 1/sig**2  # PyMC parameterizes Normals by precision: tau = 1/sigma**2
noise = n.random.normal(0, scale=sig, size=x.size)
mdata = actual_func(1., 2., 3.)
mdata += noise

# Uniform priors on the polynomial coefficients and the noise precision
a_fit = pymc.Uniform('a', 0., 2.)
b_fit = pymc.Uniform('b', 1., 3.)
c_fit = pymc.Uniform('c', 2., 4.)
tau_fit = pymc.Uniform('tau', tau/3, 3*tau)

# Deterministic node: the model prediction for the current parameter values
@deterministic
def func(a=a_fit, b=b_fit, c=c_fit): return actual_func(a, b, c)

# Likelihood: the measured data, modeled as Gaussian noise about the prediction
mdata_var = pymc.Normal('mdata', mu=func, tau=tau_fit,
                        value=mdata, observed=True)

# Assemble the model and run the Metropolis-Hastings sampler
mdl = pymc.Model([a_fit, b_fit, c_fit, tau_fit, mdata_var])
mc = pymc.MCMC(mdl)
mc.sample(iter=1000, burn=250)

# Histogram the posterior samples of 'a' from the chain
a_hist, a_edges = n.histogram(a_fit.trace(), 40, (0,2))
a_bins = (a_edges[:-1] + a_edges[1:])/2
p.plot(a_bins, a_hist)

p.show()


I chose this example over the ones that come with PyMC because it is much closer to the kind of calibration problems we will be trying to solve with PAPER. An interesting point that came up early on was the reliance of MCMC on an estimate of the noise level in the data. I remember this from the Maximum Entropy Method (MEM) deconvolution that I coded up in AIPY. An interesting technique, used here, is to include the noise level (tau_fit) as a variable that is fit for. This way you can specify a starting guess for the noise, along with a confidence interval, rather than pulling a magic number out of a hat. In this example, the fitter does a good job of accurately determining the noise level. I think what happens in this code is that the current guess for the noise level is used as part of the calculation that determines the next state to jump to in the Markov chain, and that new state may include a revision of the noise level. This could clearly become unstable for more complex systems, so I imagine some care must be exercised in leaving the noise as a free parameter.
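
As a quick sanity check on that last point, here is a minimal sketch (assuming the script above has already been run, and using only the variable names defined there) of how one might compare the posterior on tau_fit against the true precision used to generate the data, mirroring the histogram of 'a' above:

# Compare the recovered noise precision to the true value used to make the data
tau_samples = tau_fit.trace()
print 'true tau:', tau, 'posterior mean tau:', tau_samples.mean()

# Posterior histogram of tau, analogous to the plot of 'a' above
tau_hist, tau_edges = n.histogram(tau_samples, 40, (tau/3, 3*tau))
tau_bins = (tau_edges[:-1] + tau_edges[1:])/2
p.plot(tau_bins, tau_hist)
p.show()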

1 comment:

  1. Hi Aaron,

    Just noticed your post; I am one of the PyMC developers. I've been looking for some calibration examples, and was wondering if you had any other toy (or more realistic) calibration models you could share. If so, please drop me a line at (my last name) at gmail.

    Regards,
    Chris Fonnesbeck
