Beyond Turing: the LogDensityProblems interface

The log-density is all you need

Throughout the course we use Turing.jl for inference. Turing’s @model macro and ~ syntax make Bayesian models concise and readable. Under the hood, though, all Turing does is evaluate a log-density function — the same log-posterior you built by hand in the Introduction.

The Julia ecosystem has a standard interface for log-density functions: LogDensityProblems.jl. Any function conforming to this interface can be sampled by a range of MCMC packages — not just Turing. Knowing this pattern means you can work with any sampler in the ecosystem, and you’re not locked into any single framework.
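Concretely, an object conforms to the interface by implementing three methods: one returning the log-density at a point, one returning the dimension, and one declaring its capabilities. A minimal sketch using a toy density (the ToyNormal type below is hypothetical, not part of any package):

```julia
using LogDensityProblems

# Hypothetical toy problem: an isotropic normal in `dim` dimensions
struct ToyNormal
    dim::Int
end

# The three methods that make up the LogDensityProblems interface:
LogDensityProblems.logdensity(ℓ::ToyNormal, x) = -0.5 * sum(abs2, x)  # up to an additive constant
LogDensityProblems.dimension(ℓ::ToyNormal) = ℓ.dim
LogDensityProblems.capabilities(::Type{ToyNormal}) =
    LogDensityProblems.LogDensityOrder{0}()  # order 0: value only, no gradient

ℓ = ToyNormal(2)
LogDensityProblems.logdensity(ℓ, [0.0, 0.0])  # 0.0 at the mode
```

Declaring order 0 tells downstream packages that gradients must be supplied separately — for example by an automatic-differentiation wrapper, as done later in this section.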

From manual log-posterior to standard interface

In the Introduction, we wrote a function that simulates the SIR model and returns the log-likelihood:

using DifferentialEquations, Distributions, DataFrames, CSV, Random, DrWatson

function sir_ode!(du, u, p, t)
    R_0, D_inf = p
    β = R_0 / D_inf
    γ = 1.0 / D_inf
    S, I, R = u
    N = S + I + R

    du[1] = -β * S * I / N
    du[2] = β * S * I / N - γ * I
    du[3] = γ * I
end

function simulate_and_loglik(R_0, D_inf, data; init_state = [999.0, 1.0, 0.0], tspan = (0.0, 30.0))
    prob = ODEProblem(sir_ode!, init_state, tspan, [R_0, D_inf])
    sol = solve(prob, Tsit5(), saveat = data.time)
    I = sol[2, :]
    loglik = sum(logpdf.(Poisson.(max.(I, 1e-10)), data.obs))
    return (I = I, loglik = loglik)
end
simulate_and_loglik (generic function with 1 method)

To use this with DynamicHMC.jl, we need a callable struct — an object that acts as a function. It takes a named tuple of parameters (provided by the transformation layer) and returns the log-posterior:

using LogDensityProblems, TransformVariables
using DynamicHMC, LogDensityProblemsAD, TransformedLogDensities

struct SIRPosterior{T}
    data::T
end

function (p::SIRPosterior)(θ)
    R_0, D_inf = θ.R_0, θ.D_inf
    # Priors
    lp = logpdf(Uniform(1, 20), R_0) + logpdf(Uniform(1, 14), D_inf)
    isinf(lp) && return lp  # Outside prior support
    # Likelihood
    result = simulate_and_loglik(R_0, D_inf, p.data)
    return lp + result.loglik
end

The (p::SIRPosterior)(θ) syntax makes SIRPosterior callable — you can write posterior(θ) just like a function. TransformedLogDensities.jl wraps this callable with parameter transformations and automatically provides the LogDensityProblems interface that DynamicHMC expects.
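The callable-struct pattern itself is plain Julia, and is worth seeing in isolation. A minimal sketch with a hypothetical Affine type (unrelated to the SIR example):

```julia
# Hypothetical example: a struct whose instances act like functions
struct Affine
    a::Float64
    b::Float64
end

# Defining a method on the *instance* makes every Affine callable
(f::Affine)(x) = f.a * x + f.b

f = Affine(2.0, 1.0)
f(3.0)  # 7.0
```

The advantage over a closure is that the struct carries its data (here `a` and `b`; in `SIRPosterior`, the observed data) as typed fields, which is friendly to Julia's compiler and to automatic differentiation.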

Sampling with DynamicHMC.jl

DynamicHMC.jl is a standalone NUTS implementation. It uses TransformVariables.jl to map constrained parameters to unconstrained space (which NUTS requires):
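To see what the transformation layer does, you can apply it directly. A small sketch, assuming the same parameter bounds used below (the midpoint behaviour follows from the scaled-logistic transform TransformVariables uses for bounded intervals):

```julia
using TransformVariables

# Same transformation as in the sampling code below
t = as((R_0 = as(Real, 1.0, 20.0), D_inf = as(Real, 1.0, 14.0)))

TransformVariables.dimension(t)  # 2 unconstrained dimensions

θ = TransformVariables.transform(t, [0.0, 0.0])
# θ is a NamedTuple with θ.R_0 ∈ (1, 20) and θ.D_inf ∈ (1, 14);
# 0.0 maps to each interval's midpoint under the scaled-logistic transform

TransformVariables.inverse(t, θ)  # maps back to (approximately) [0.0, 0.0]
```

The sampler only ever sees the unconstrained vector; `transform` converts each proposal into the named, bounded parameters that the log-posterior callable expects.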

# Load data
epi1 = CSV.read(datadir("epi1_synthetic.csv"), DataFrame)

# Create the posterior
posterior = SIRPosterior(epi1)

# Define parameter transformations (constrained → unconstrained)
# R_0 ∈ (1, 20), D_inf ∈ (1, 14)
trans = as((R_0 = as(Real, 1.0, 20.0), D_inf = as(Real, 1.0, 14.0)))

# Wrap with transformations and automatic differentiation
transformed_posterior = TransformedLogDensity(trans, posterior)
∇posterior = ADgradient(:ForwardDiff, transformed_posterior)

# Sample
Random.seed!(42)
results = mcmc_with_warmup(Random.default_rng(), ∇posterior, 2000)

# Extract samples (back in constrained space)
samples = [TransformVariables.transform(trans, col) for col in eachcol(results.posterior_matrix)]
R_0_samples = [s.R_0 for s in samples]
D_inf_samples = [s.D_inf for s in samples]

println("R_0: mean = $(round(mean(R_0_samples), digits=2)), std = $(round(std(R_0_samples), digits=2))")
println("D_inf: mean = $(round(mean(D_inf_samples), digits=2)), std = $(round(std(D_inf_samples), digits=2))")
R_0: mean = 2.32, std = 0.03
D_inf: mean = 1.99, std = 0.04
using Plots, StatsPlots

p1 = histogram(R_0_samples, normalize=:pdf, xlabel="R_0", ylabel="Density",
               title="R_0 posterior", label="DynamicHMC", alpha=0.7)
p2 = histogram(D_inf_samples, normalize=:pdf, xlabel="D_inf", ylabel="Density",
               title="D_inf posterior", label="DynamicHMC", alpha=0.7)
plot(p1, p2, layout=(1, 2), size=(800, 350))

Same data, same log-posterior, same NUTS algorithm — no Turing involved.

When to use this

  • Turing works: Use Turing. The @model syntax is clearer and faster to write.
  • You hit a Turing issue you can’t resolve: You can bypass the Turing stack entirely by working with the log-density directly.
  • You want a sampler Turing doesn’t offer: LogDensityProblems is the standard way to plug into alternative samplers.
  • You want fewer dependencies: For deployed code, a minimal stack (your model + AD + sampler) can be easier to maintain.

Compatible samplers

Any sampler that supports the LogDensityProblems interface will work; DynamicHMC.jl, used above, is one example.
