Error in Black Box Likelihood Function Example with PyMC

Hello,

I’m working through the “Using a ‘black box’ likelihood function” example, but I’m encountering an error and it’s not working as expected.

Specifically, the error occurs with the following code:

grad_model.compile_dlogp()(ip)

The error message is as follows:

---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
Cell In[24], line 1
----> 1 grad_model.compile_dlogp()(ip)

File d:\anaconda\envs\pymc\Lib\site-packages\pymc\model\core.py:620, in Model.compile_dlogp(self, vars, jacobian, **compile_kwargs)
    604 def compile_dlogp(
    605     self,
    606     vars: Variable | Sequence[Variable] | None = None,
    607     jacobian: bool = True,
    608     **compile_kwargs,
    609 ) -> PointFunc:
    610     """Compiled log probability density gradient function.
    611 
    612     Parameters
   (...)
    618         Whether to include jacobian terms in logprob graph. Defaults to True.
    619     """
--> 620     return self.compile_fn(self.dlogp(vars=vars, jacobian=jacobian), **compile_kwargs)

File d:\anaconda\envs\pymc\Lib\site-packages\pymc\model\core.py:760, in Model.dlogp(self, vars, jacobian)
    758 cost = self.logp(jacobian=jacobian)
    759 cost = rewrite_pregrad(cost)
--> 760 return gradient(cost, value_vars)
...
---> 32     raise NotImplementedError("Gradient only implemented for scalar m and c")
     34 grad_wrt_m, grad_wrt_c = loglikegrad_op(m, c, sigma, x, data)
     36 # out_grad is a tensor of gradients of the Op outputs wrt to the function cost

NotImplementedError: Gradient only implemented for scalar m and c

Environment details:

  • python : 3.12.8
  • pytensor : 2.26.4
  • pymc : 5.19.1
  • arviz : 0.20.0
  • matplotlib: 3.10.0
  • scipy : 1.14.1
  • numpy : 1.26.4

I would appreciate it if you could help me understand the cause of this error and how to resolve it.

Hello, I ran into the same error.

I tried to build my own likelihood function with a gradient by following the “Using a ‘black box’ likelihood function” example, but it failed at `model.compile_dlogp()(ip)` with the same `NotImplementedError`. I then ran the example code in Colab, and it turns out the example itself doesn’t work :sweat_smile:

After a lot of investigation, I figured out what happens.

First, if you delete the check in `grad()` (`if m.type.ndim != 0: ...`), you instead hit a different error: “LogLikeWithGrad.grad returned a term with 0 dimensions, but 1 are required”, raised by the check `if hasattr(var, "ndim") and term.ndim != var.ndim: raise ValueError(...)`.
These errors mean that the first term returned from `grad()` is a scalar, while the corresponding input variable of the node is a vector with shape `(1,)`.

That clarifies the problem, and the solution is quite simple:
In `make_node()` of `LogLikeWithGrad(Op)`, after converting the inputs to tensor variables with `m = pt.as_tensor(m)`, add the line `m = m[0]`. That turns the shape-`(1,)` vector into a scalar with the same number of dimensions as the output of `grad()`. Besides that, you can add a dimension check (such as `if m.type.ndim != 0`) to guard against this kind of error.
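To see why indexing out the element fixes the mismatch, here is the same shape logic sketched in plain NumPy rather than PyTensor (the names `m` and `grad_wrt_m` just mirror the example’s variables; the values are illustrative):

```python
import numpy as np

# PyTensor hands make_node() the parameter m as a vector of shape (1,),
# while grad() returns a 0-d (scalar) gradient term for it.
m = np.asarray([0.5])          # shape (1,), ndim == 1
grad_wrt_m = np.asarray(2.0)   # 0-d scalar, ndim == 0

# This is the mismatch behind "returned a term with 0 dimensions,
# but 1 are required":
assert m.ndim != grad_wrt_m.ndim

# The fix: index out the single element so the input variable becomes
# a scalar with the same ndim as the gradient term.
m_scalar = m[0]
assert np.ndim(m_scalar) == grad_wrt_m.ndim == 0
```

The same one-line indexing (`m = m[0]`, and likewise for `c`) inside `make_node()` makes the node’s input ndim agree with what `grad()` returns.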
Hope my experience helps you.

Thanks for investigating, I’ll take a look