Throughout the course we use Turing.jl for inference. Turing’s @model macro and ~ syntax make Bayesian models concise and readable. Under the hood, though, all Turing does is evaluate a log-density function — the same log-posterior you built by hand in the Introduction.
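To make the contrast concrete, here is a minimal Turing model (a toy example with illustrative priors, not the course's SIR model):

using Turing

# A toy model: one unknown mean with a standard normal prior,
# observations modelled as Normal(μ, 1)
@model function toy_model(y)
    μ ~ Normal(0, 1)
    for i in eachindex(y)
        y[i] ~ Normal(μ, 1)
    end
end

Every ~ statement adds a term to a log-density; sampling the model means evaluating that log-density repeatedly.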
The Julia ecosystem has a standard interface for log-density functions: LogDensityProblems.jl. Any function conforming to this interface can be sampled by a range of MCMC packages — not just Turing. Knowing this pattern means you can work with any sampler in the ecosystem, and you're not locked into any single framework.
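For concreteness, here is what the interface looks like for a toy standard-normal target. This is a sketch, not part of the course code: the ToyNormal type is hypothetical, but the three methods shown are the interface that LogDensityProblems.jl requires.

using LogDensityProblems

struct ToyNormal end

# Log-density of a 2-dimensional standard normal (up to an additive constant)
LogDensityProblems.logdensity(::ToyNormal, x) = -0.5 * sum(abs2, x)

# Dimension of the parameter vector
LogDensityProblems.dimension(::ToyNormal) = 2

# Declare what the object can compute: order 0 means log-density only, no gradient
LogDensityProblems.capabilities(::Type{ToyNormal}) = LogDensityProblems.LogDensityOrder{0}()

Any sampler that speaks this interface can now draw from ToyNormal (given a gradient wrapper where required), which is exactly how DynamicHMC will consume our SIR posterior below.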
From manual log-posterior to standard interface
In the Introduction, we wrote a function that simulates the SIR model and returns the log-likelihood of the observed case counts (along with the simulated trajectory):
using DifferentialEquations, Distributions, DataFrames, CSV, Random, DrWatson

function sir_ode!(du, u, p, t)
    R_0, D_inf = p
    β = R_0 / D_inf
    γ = 1.0 / D_inf
    S, I, R = u
    N = S + I + R
    du[1] = -β * S * I / N
    du[2] = β * S * I / N - γ * I
    du[3] = γ * I
end

function simulate_and_loglik(R_0, D_inf, data;
                             init_state = [999.0, 1.0, 0.0],
                             tspan = (0.0, 30.0))
    prob = ODEProblem(sir_ode!, init_state, tspan, [R_0, D_inf])
    sol = solve(prob, Tsit5(), saveat = data.time)
    I = sol[2, :]
    loglik = sum(logpdf.(Poisson.(max.(I, 1e-10)), data.obs))
    return (I = I, loglik = loglik)
end
simulate_and_loglik (generic function with 1 method)
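As a quick sanity check, we can evaluate it at a plausible parameter pair (this assumes the synthetic data file from the Introduction is available; the values 2.0 and 2.0 are arbitrary):

# Hypothetical spot check of the log-likelihood
epi1 = CSV.read(datadir("epi1_synthetic.csv"), DataFrame)
simulate_and_loglik(2.0, 2.0, epi1).loglik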
To use this with DynamicHMC.jl, we need a callable struct — an object that acts as a function. It takes a named tuple of parameters (provided by the transformation layer) and returns the log-posterior:
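(A minimal sketch; the uniform priors here are an assumption, chosen to match the transformation bounds defined below.)

struct SIRPosterior
    data::DataFrame
end

function (p::SIRPosterior)(θ)
    (; R_0, D_inf) = θ
    # Log-prior: uniform over the same ranges as the transformations
    # below (an illustrative choice, not the only reasonable one)
    logprior = logpdf(Uniform(1.0, 20.0), R_0) + logpdf(Uniform(1.0, 14.0), D_inf)
    # Log-likelihood from the ODE simulation
    loglik = simulate_and_loglik(R_0, D_inf, p.data).loglik
    return logprior + loglik
end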
The (p::SIRPosterior)(θ) syntax makes SIRPosterior callable — you can write posterior(θ) just like a function. TransformedLogDensities.jl wraps this callable with parameter transformations and automatically provides the LogDensityProblems interface that DynamicHMC expects.
Sampling with DynamicHMC.jl
DynamicHMC.jl is a standalone NUTS implementation. It uses TransformVariables.jl to map constrained parameters to unconstrained space (which NUTS requires):
using DynamicHMC, TransformVariables, TransformedLogDensities, LogDensityProblemsAD, ForwardDiff

# Load data
epi1 = CSV.read(datadir("epi1_synthetic.csv"), DataFrame)

# Create the posterior
posterior = SIRPosterior(epi1)

# Define parameter transformations (constrained → unconstrained)
# R_0 ∈ (1, 20), D_inf ∈ (1, 14)
trans = as((R_0 = as(Real, 1.0, 20.0), D_inf = as(Real, 1.0, 14.0)))

# Wrap with transformations and automatic differentiation
# (ADgradient comes from LogDensityProblemsAD.jl)
transformed_posterior = TransformedLogDensity(trans, posterior)
∇posterior = ADgradient(:ForwardDiff, transformed_posterior)

# Sample
Random.seed!(42)
results = mcmc_with_warmup(Random.default_rng(), ∇posterior, 2000)

# Extract samples (back in constrained space)
samples = [TransformVariables.transform(trans, col) for col in eachcol(results.posterior_matrix)]
R_0_samples = [s.R_0 for s in samples]
D_inf_samples = [s.D_inf for s in samples]

println("R_0: mean = $(round(mean(R_0_samples), digits=2)), std = $(round(std(R_0_samples), digits=2))")
println("D_inf: mean = $(round(mean(D_inf_samples), digits=2)), std = $(round(std(D_inf_samples), digits=2))")
R_0: mean = 2.32, std = 0.03
D_inf: mean = 1.99, std = 0.04
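Because transformed_posterior implements the LogDensityProblems interface, any compatible sampler can consume it directly. A quick illustration (the log-density value depends on the data, so treat this as a sketch):

using LogDensityProblems

# The wrapped posterior answers the standard interface queries
LogDensityProblems.dimension(transformed_posterior)             # 2 parameters
LogDensityProblems.logdensity(transformed_posterior, zeros(2))  # log-posterior at the unconstrained origin

This is the interoperability payoff: the same object could be handed to any other LogDensityProblems-aware sampler without touching the model code.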