
Summing RVs Using PyMC3

I am attempting to implement the model from the image. I am new to PyMC3 and I am not sure how to structure the model correctly. My attempt is below:

# sample data
logprem = np.arr

Solution 1:

The correct shape for C is (W, D), and since underlying all of this is a Theano computational graph on tensor objects, it is best to avoid Python loops and restrict oneself to theano.tensor operations. Here is an implementation along those lines:

import numpy as np
import pymc3 as pm
import theano
import theano.tensor as tt

D = 10
W = 10

# begin with a (W, 1) column vector, then broadcast to (W, D)
logprem = theano.shared(
    np.array(
        [[8.66768002, 8.49862181, 8.60410456, 8.54966038, 8.55910259,
          8.56216656, 8.51559191, 8.60630237, 8.56140145, 8.50956416]]) \
      .T \
      .repeat(D, axis=1))

with pm.Model() as model:
    logelr = pm.Normal('logelr', -0.4, np.sqrt(10))

    # col vector
    alpha = pm.Normal("alpha", 0, sigma=np.sqrt(10), shape=(W-1,1))

    # row vector
    beta = pm.Normal("beta", 0, sigma=np.sqrt(10), shape=(1,D-1))

    # prepend zero and broadcast to (W,D)
    alpha_aug = tt.concatenate([tt.zeros((1,1)), alpha], axis=0).repeat(D, axis=1)

    # append zero and broadcast to (W,D)
    beta_aug = tt.concatenate([beta, tt.zeros((1,1))], axis=1).repeat(W, axis=0)

    # technically the indices will be reversed;
    # e.g., a[0], a[9] here correspond to a_10, a_1 in the paper, resp.
    a = pm.Uniform('a', 0, 1, shape=D)

    # Note: the [::-1] reverses the cumulative sum so the specified
    # ordering holds (sigma_0 > sigma_1 > ...)
    sigma = pm.Deterministic('sigma', tt.extra_ops.cumsum(a)[::-1].reshape((1,D)))

    # now everything here has shape (W,D) or is scalar (logelr)
    mu = logprem + logelr + alpha_aug + beta_aug

    # sigma will be broadcast automatically
    C = pm.Lognormal('C', mu=mu, sigma=sigma, shape=(W,D))

The key tricks are:

  • prepending and appending zeros onto alpha and beta, which allows everything to be kept in tensor form;
  • using the tt.extra_ops.cumsum method to concisely express Step 5;
  • getting all the terms in Step 6 to have the shape (W,D).
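The cumsum trick for Step 5 is easy to check in plain NumPy: a reversed cumulative sum of positive draws is automatically a decreasing sequence. A quick sketch (the random draws here are just illustrative data):

```python
import numpy as np

# draws on (0, 1) are strictly positive, so their cumulative sum is
# increasing; reversing it yields a strictly decreasing sequence,
# which enforces sigma_0 > sigma_1 > ... > sigma_{D-1}
rng = np.random.default_rng(0)
a = rng.uniform(0, 1, size=10)
sigma = np.cumsum(a)[::-1]

assert np.all(np.diff(sigma) < 0)  # decreasing everywhere
```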

This could be simplified even further if one could take an outer "product" of the alpha and beta vectors using an addition operator (e.g., in R the outer function allows arbitrary ops), but I couldn't find such a method under theano.tensor.
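For what it's worth, ordinary broadcasting already yields an outer sum in NumPy: adding a (W, 1) column to a (1, D) row expands both to (W, D), so the explicit .repeat calls could in principle be dropped (Theano broadcasts similarly, though it is governed by broadcastable flags on the tensors). A NumPy sketch with toy shapes:

```python
import numpy as np

W, D = 4, 3
col = np.arange(W, dtype=float).reshape(W, 1)  # stand-in for the alpha column
row = np.arange(D, dtype=float).reshape(1, D)  # stand-in for the beta row

# broadcasting: (W, 1) + (1, D) -> (W, D), i.e. an outer sum
outer_sum = col + row

assert outer_sum.shape == (W, D)
assert outer_sum[2, 1] == col[2, 0] + row[0, 1]
```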

This doesn't really sample well using NUTS, but it might improve once you have actual observed values of C to plug in.

with model:
    # a lower target_accept or less tuning throws divergences
    trace = pm.sample(tune=5000, draws=2000, target_accept=0.99)

pm.summary(trace, var_names=['alpha', 'beta', 'a', 'sigma'])

Since this is just prior sampling, the only genuinely interesting result is the distribution of the transformed variable sigma:

pm.plot_forest(trace, var_names=['sigma'])

(forest plot of the sampled sigma values)

which one can see coheres with the requirement that sigma_{d} > sigma_{d+1}.
