You can model this scenario in PyMC2, and in one sense it is easy to do. But in another sense it is hard, so I will demonstrate a solution for the special case of a model where $b-a = d-c$. I say it is easy because PyMC2 can use...

Yes, it's possible to make something with a complex or arbitrary likelihood, though that doesn't seem to be what you're doing here. It looks like you have a complex transformation of one variable into another: the integration step. Your particular exception is that integrate.quad is expecting a numpy array, not a...

You need to construct a deterministic function that generates p_diabetes as a function of your predictors. The safest way to do this is via a logit-linear transformation. For example:

    intercept = pymc.Normal('intercept', 0, 0.01, value=0)
    beta_race = pymc.Normal('beta_race', 0, 0.01, value=np.zeros(4))
    beta_bmi = pymc.Normal('beta_bmi', 0, 0.01, value=0)

    @pymc.deterministic
    def p_diabetes(b0=intercept,...
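The deterministic above is cut off, but the idea of a logit-linear transformation can be sketched in plain numpy. The predictor values and coefficient values below are hypothetical stand-ins for the model's stochastics:

```python
import numpy as np

# hypothetical predictor values for one subject
race_onehot = np.array([0, 1, 0, 0])  # indicator coding for a 4-level race variable
bmi = 27.0

# hypothetical coefficient values; in the PyMC model these are Normal stochastics
b0 = -2.0
b_race = np.array([0.1, 0.3, -0.2, 0.0])
b_bmi = 0.05

def p_diabetes(b0, b_race, b_bmi):
    """Logit-linear transformation: linear predictor through the inverse logit."""
    eta = b0 + race_onehot @ b_race + b_bmi * bmi
    return 1.0 / (1.0 + np.exp(-eta))

p = p_diabetes(b0, b_race, b_bmi)  # always strictly between 0 and 1
```

The inverse logit guarantees p is a valid probability for any real coefficient values, which is what makes this the safe way to build p_diabetes.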

This uninformative error is due to the way you have organized your data vector. It is 2 rows by n columns, and PyMC expects it to be n rows by 2 columns. The following modification makes this code (almost) work for me:

    xy = MvNormal('xy', mu=mean, tau=precision, value=data.T, observed=True)

I...
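The shape mismatch is easy to check directly in numpy; here is a small sketch with an arbitrary n of 100:

```python
import numpy as np

data = np.random.randn(2, 100)  # 2 rows by n columns: the problematic layout
flipped = data.T                # n rows by 2 columns: one observation per row

# each row of `flipped` is now a single 2-dimensional observation,
# which matches what MvNormal expects for its value argument
```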

There are perhaps too many ways to create a model in PyMC2. The one you used, passing an iterable of pymc.Node instances, does not record the names, so the model doesn't have an M.a, even though M.nodes contains a stochastic named 'a'. If you prefer to create your model this...

You've switched things just right, but if you want to model multiple observed values as independent, you can sum them in the joint log-likelihood:

    @pymc.observed
    def age(value=np.array([12, 43, 28, 39, 87, 26])):
        return sum(map(age_logp, value))

You can combine them some other way if you prefer. The key is that age...
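The summing pattern can be checked outside of PyMC. Here age_logp is a hypothetical stand-in (an exponential log-density) for whatever single-value log-likelihood the model actually defines:

```python
import numpy as np

def age_logp(value, rate=1 / 40.0):
    """Hypothetical per-observation log-likelihood (exponential density)."""
    return np.log(rate) - rate * value

ages = np.array([12, 43, 28, 39, 87, 26])

# independence means the joint log-likelihood is the sum of
# the per-observation log-likelihood terms
joint = sum(map(age_logp, ages))
```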

To run them serially, you can use a similar approach to your PyMC 2 example. The main difference is that each call to sample returns a multi-chain trace instance (containing just a single chain in this case). merge_traces will take a list of multi-chain instances and create a single instance...


The closest I know of is here: http://nbviewer.ipython.org/github/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/blob/master/Chapter5_LossFunctions/LossFunctions.ipynb (see Example: Kaggle contest on Observing Dark Worlds). On this thread you asked a specific question about finding an array in an image: https://github.com/pymc-devs/pymc/issues/653 Here is a first attempt at a model: In that case it seems like you are trying to...

I think that this is a bug (here is a link to the issue you opened, thanks!). Here is a workaround you can use for now: instead of creating the observations as you have done above, use n and p arguments whose dimensions match data: observations = Binomial("obs",...


The error means the optimization algorithm finished but returned values that don't make any sense. Usually this is because the maximum isn't well defined. However, this actually worked fine for me. What versions do you have? I have python 2.7, latest pymc3, theano 0.7.0, scipy 0.13...

Not sure what you mean by theta, since there is no theta in your model. Are you referring to the coin-specific probabilities (which are here represented by mint)? You have specified a single probability for all the coins, rather than 4 probabilities. Try modifying your mint parameter to: mint =...

When you create your normal stochastic with pymc.Normal('w0', 0, 0.000001), PyMC2 initializes the value with a random draw from the prior distribution. Since your prior is so diffuse, this can be a value so unlikely that the posterior is effectively zero. To fix this, just request a reasonable...
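To see why a random draw from so diffuse a prior misbehaves, note the standard deviation that the precision implies (a quick sketch):

```python
import numpy as np

tau = 0.000001               # the precision passed to pymc.Normal('w0', 0, 0.000001)
sd = 1.0 / np.sqrt(tau)      # PyMC2 parameterizes Normal by precision, so sd = 1000

# an initial value drawn from N(0, 1000**2) is routinely hundreds of units
# from zero, which is why supplying a sensible value= argument avoids the problem
```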


It looks like you are using PyMC2, and as far as I know, you must use some Python approach to parallel computation, like IPython.parallel. There are many ways to do this, but all the ones I know are a little bit complicated. Here is an example of one, which uses...


The Dirichlet distribution is a continuous distribution, so its density may be greater than 1. Remember that a continuous density must be nonnegative, and its integral must be 1. But it is not required that the density be less than 1 everywhere. About your second question -- my advice is...
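A quick check with scipy (not PyMC, just an illustration): the symmetric Dirichlet(10, 10) density at the center of the simplex is well above 1, even though the density integrates to 1 over the simplex.

```python
from scipy.stats import dirichlet

# density of Dirichlet(alpha=[10, 10]) at the point (0.5, 0.5) on the simplex
density = dirichlet.pdf([0.5, 0.5], [10, 10])

# the pdf is nonnegative and integrates to 1, but pointwise it can exceed 1;
# here it is roughly 3.5
```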

Here is my translation of your PyMC2 model:

    with pm.Model() as model:
        # global model parameters
        home = pm.Normal('home', 0, .0001)
        tau_att = pm.Gamma('tau_att', .1, .1)
        tau_def = pm.Gamma('tau_def', .1, .1)
        intercept = pm.Normal('intercept', 0, .0001)

        # team-specific model parameters
        atts_star = pm.Normal("atts_star", mu=0, tau=tau_att,...

You have been bitten by a sneaky part of Python. The wildcard import of pymc replaced your numpy exp with a different exp. To get the exp you want, you can use np.exp in your sigmoid deterministic. (Where did np. come from, I wonder?)

    return np.exp(1.0 / (1.0 + np.exp(bs...
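A minimal sketch of the shadowing problem, using a string stand-in rather than an actual `from pymc import *`:

```python
import numpy as np

def sigmoid(x):
    # writing np.exp instead of a bare exp protects this call
    # against any later wildcard import rebinding the name exp
    return 1.0 / (1.0 + np.exp(-x))

exp = "shadowed"       # simulates a wildcard import clobbering the bare name exp
value = sigmoid(0.0)   # still works, because the call is qualified as np.exp
```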

PyMC actually turns switchpoint into a regular variable, so you just need to do

    test = switchpoint

It looks strange because you're defining it as a decorated function, but that's not actually how you're supposed to use it. It makes more sense if you look at the other ways you...

If I understand this correctly, it is a nice example to demonstrate some of the differences in thinking between Anglican and PyMC. Here is a tweaked version of your PyMC code that I think captures your intention:

    def make_model():
        a = pymc.Poisson("a", 100)  # better to have the stochastics themselves...

The best approach is to code a self-tuning algorithm that starts with an arbitrary variance for the proposal step size and tunes this variance as the algorithm progresses. You are shooting for an acceptance rate of 25-50% for the Metropolis algorithm.
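Here is a minimal sketch of such a self-tuning Metropolis sampler, written in plain numpy against a standard-normal target. The tuning rule and its constants are illustrative assumptions, not PyMC's exact scheme:

```python
import numpy as np

def adaptive_metropolis(logp, x0, iters=5000, tune_interval=100, seed=0):
    """Metropolis sampler that tunes its proposal sd toward 25-50% acceptance."""
    rng = np.random.default_rng(seed)
    x = x0
    sd = 1.0            # arbitrary initial step size, tuned as the chain runs
    samples = []
    accepted = 0
    for i in range(1, iters + 1):
        proposal = x + rng.normal(0, sd)
        if np.log(rng.uniform()) < logp(proposal) - logp(x):
            x = proposal
            accepted += 1
        samples.append(x)
        if i % tune_interval == 0:
            rate = accepted / tune_interval
            if rate < 0.25:       # too few acceptances: take smaller steps
                sd *= 0.9
            elif rate > 0.5:      # too many acceptances: take bigger steps
                sd *= 1.1
            accepted = 0
    return np.array(samples), sd

# target: standard normal log-density (up to an additive constant)
samples, sd = adaptive_metropolis(lambda x: -0.5 * x**2, x0=0.0)
```

After a few tuning cycles the proposal scale settles where the acceptance rate sits in the target band, and the chain's marginal distribution approaches the target.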

The only way sequential updating works sensibly is in two different models. Specifying them in the same model does not make any sense, since we have no posteriors until after MCMC has completed. In principle, you would examine the distribution of theta1 and specify a prior that best resembles it....
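For a conjugate model the two-model workflow can be written out exactly. Here is a Beta-Binomial sketch with made-up coin-flip counts, where the posterior of the first model becomes the prior of the second:

```python
# model 1: flat Beta(1, 1) prior, then hypothetical data
a, b = 1.0, 1.0
heads1, n1 = 7, 10
a, b = a + heads1, b + (n1 - heads1)   # posterior after model 1: Beta(8, 4)

# model 2: starts from that posterior as its prior, then sees its own data
heads2, n2 = 3, 10
a, b = a + heads2, b + (n2 - heads2)   # posterior after model 2: Beta(11, 11)

# same result as fitting all 20 flips at once under the original flat prior
posterior_mean = a / (a + b)
```

With a non-conjugate model you cannot update in closed form like this; as described above, you would instead inspect the first posterior's samples and pick a parametric prior that resembles them.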


For posterity: thanks to Chris Fonnesbeck for pointing out that the problem was that I did not give W as an argument to idt. This function should be

    @deterministic
    def idt(b1=beta, dl0=dL0, W=W):
        print beta.value
        fitted = np.exp(np.dot(X, b1) + W[stfips])
        yy = (Y[:,:T] * np.outer(fitted, dl0))
        return np.transpose(yy)

And...