PiecewiseInference.jl
PiecewiseInference.PiecewiseInference — Module

PiecewiseInference.jl
Suite for inverse modelling of dynamical systems characterised by complex dynamics.
PiecewiseInference.jl implements a segmentation method that improves the convergence of local search methods by regularizing the inference problem.

Installation
PiecewiseInference.jl depends on ParametricModels.jl, a non-registered package. As such, to install PiecewiseInference.jl, you first need to add an extra registry to your Julia installation that tracks both ParametricModels.jl and PiecewiseInference.jl.
To proceed, open Julia and type the following
using Pkg
pkg"registry add https://github.com/vboussange/VBoussangeRegistry.git"Then go on and
pkg"add PiecewiseInference"That's it! This will download the latest version of PiecewiseInference.jl from this git repo and download all dependencies.
Getting started
See the documentation and the test folder for up-to-date examples.
Related packages
DiffEqFlux is a package with goals similar to those of PiecewiseInference. It proposes the method DiffEqFlux.multiple_shooting, which is close to PiecewiseInference.inference but does not infer initial conditions. PiecewiseInference further proposes several utility methods for model selection.
Reference
- Boussange, V., Vilimelis-Aceituno, P., Pellissier, L., Mini-batching ecological data to improve ecosystem models with machine learning. bioRxiv, 46 pages.
PiecewiseInference.InferenceProblem — Method

InferenceProblem(
model,
p0;
p_bij,
u0_bij,
loss_param_prior,
loss_u0_prior,
loss_likelihood
)
Arguments

- model: a model of type AbstractModel.
- p0: initial guess of the parameters. Should be a named tuple.
Optional
- p_bij: a dictionary containing bijectors, to constrain parameter values and initial conditions.
- loss_param_prior: a function with argument p::NamedTuple. Should correspond to parameter priors. By default, loss_param_prior(p) = 0.
- loss_u0_prior: a function with arguments u0_pred, u0_data. Should correspond to initial condition priors. By default, it corresponds to the RSS between u0 and the corresponding data point.
- loss_likelihood: a function that matches the predictions against the data, with arguments data, pred, rng. By default, it corresponds to the RSS.
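A minimal sketch of constructing an InferenceProblem with constrained parameters (assuming a model::AbstractModel already built with ParametricModels.jl, as in the full example further below):

using Distributions, Bijectors

p_init = (b = [1., 2.],)
p_bij = (bijector(Uniform(1e-3, 5e0)),) # one bijector per parameter in p_init
u0_bij = bijector(Uniform(1e-3, 5.))    # constrains the inferred initial conditions

# a simple sum-of-squares likelihood; `rng` is the time-step range of the segment (unused here)
loss_likelihood(data, pred, rng) = sum(abs2, data .- pred)

infprob = InferenceProblem(model, p_init; p_bij, u0_bij, loss_likelihood)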
PiecewiseInference.AIC — Method

AIC(RSS, k, m)

Calculate the AIC of a model given its RSS, k its number of parameters, and m the number of observations.
PiecewiseInference.AIC — Method

AIC(data, tsteps, infres)

Computes the AIC of the inference result infres given the observational noise distribution noisedistrib.
PiecewiseInference.AIC — Method

AIC(logl, nparams)
Computes the AIC given loglikelihood logl and number of parameters nparams.
PiecewiseInference.AICc — Method

AICc(aic, k, m)

Calculate the corrected AIC (AICc) of a model given its aic, k its number of parameters, and m the number of observations (if d variables and T time steps, m = d * T). The formula is taken from Mangan2017.
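For orientation, the standard textbook forms of these criteria are sketched below; the exact variants implemented by the methods above follow the cited references.

# Standard textbook definitions, for orientation only — not necessarily
# the exact conventions used by the methods above.
aic_from_logl(logl, k) = 2k - 2logl                         # AIC from a loglikelihood
aicc_from_aic(aic, k, m) = aic + 2k * (k + 1) / (m - k - 1) # small-sample correction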
PiecewiseInference.AICc_TREE — Method

AICc_TREE(RSS, k, m)

Calculate the AICc of a model given its RSS, k its number of parameters, and m the number of observations. The formula is taken from the TREE article.
PiecewiseInference.FIM_strouwen — Method

FIM_strouwen(predict, θ, Σ)

Returns the FIM matrix associated to predict and evaluated at θ. Taken from https://arnostrouwen.com/post/dynamic-experimental-design/.

- predict: a function that takes θ as a parameter and returns an array, with dim = 1 corresponding to the state variables and dim = 2 corresponding to the time steps
- θ: the parameter vector where to evaluate the FIM
- Σ: the variance-covariance matrix of the observation errors
PiecewiseInference.FIM_yazdani — Method

FIM_yazdani(dudt, u0_true, tspan, p_true, Σ)

Returns the FIM matrix associated to the problem defined by prob = ODEProblem(dudt, u0_true, tspan, p_true). Σ is the variance-covariance matrix of the observation errors.
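A hedged usage sketch, reusing the logistic model from the inference example further below (all values are illustrative):

using LinearAlgebra

function dudt(du, u, p, t) # in-place ODE function, as expected by ODEProblem
    du .= 0.1 .* u .* (1 .- p .* u)
end

u0_true = ones(2)
tspan = (0., 10.)
p_true = [0.23, 0.5]
Σ = Diagonal(fill(0.1^2, 2)) # variance-covariance matrix of the observation errors

fim = FIM_yazdani(dudt, u0_true, tspan, p_true, Σ)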
PiecewiseInference.R2 — Method

R2(odedata, pred, _)

We distinguish between the R2 for log-transformed values (with dist::MvLogNormal as last argument) and the standard R2, in order to discard non-positive values in the former case.

PiecewiseInference.RSS — Method

RSS(res, data_set, noisedistrib)
Computes the RSS of res given data_set.
Argument:

- fn: set to identity by default, but can be e.g. log if a lognormal loss is used.
PiecewiseInference.compute_cis — Method

compute_cis(reseco, odedata, noisedistrib, p, α, σ)
PiecewiseInference.compute_cis — Method

compute_cis(var_cov_matrix, p, α)
Compute confidence intervals, given var_cov_matrix, parameters p and a confidence level α.
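For instance, with illustrative values:

var_cov_matrix = [0.04 0.0; 0.0 0.09] # hypothetical variance-covariance matrix
p = [0.23, 0.5]                       # parameter estimates
α = 0.05                              # confidence level, as in the signature above

cis = compute_cis(var_cov_matrix, p, α)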
PiecewiseInference.divisors — Method

divisors(n)

Returns all divisors of n, sorted.
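This can be handy for picking a group_size that evenly splits the data, e.g.:

divisors(12) # expected to return [1, 2, 3, 4, 6, 12]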
PiecewiseInference.estimate_σ — Method

estimate_σ(pred, odedata, _)
Estimate noise variance, assuming similar noise across all the dimensions of the data.
PiecewiseInference.get_evidence — Method

get_evidence(data, tsteps, infprob, p, u0s, ranges)
Provides evidence p(M|data) = ∫p(data|M, θ) p(θ) p(M) dθ for model M stored in infprob given the data, using MAP estimate p and u0s. Here it is assumed that p(M) = 1.
Relies on Laplace's method.
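Schematically, Laplace's method approximates the evidence integral by a Gaussian expansion around the MAP estimate θ̂: p(data|M) ≈ p(data|M, θ̂) p(θ̂) (2π)^(k/2) |H|^(-1/2), where H is the Hessian of the negative log-joint density at θ̂ and k the number of parameters.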
Note
For now, we do not integrate over initial conditions u0s, but this may be considered.
PiecewiseInference.get_ranges — Method

get_ranges(; group_size, group_nb, datasize)
PiecewiseInference.group_ranges_gn — Method

group_ranges_gn(datasize, group_nb)

Similar to group_ranges, except that it takes as argument the number of groups wanted, group_nb.

PiecewiseInference.group_ranges_gs — Method

group_ranges_gs(datasize, group_size)
Get ranges that partition data of length datasize in groups of groupsize observations. If the data isn't perfectly divisible by groupsize, the last group contains the remaining observations. Taken from https://github.com/SciML/DiffEqFlux.jl/blob/80c4247c19860d0422211d6a65283d896eeaa831/src/multiple_shooting.jl#L273-L303
group_ranges(datasize, groupsize)

Arguments:

- datasize: number of data points to be partitioned
- groupsize: maximum number of observations in each group
Example:
julia> group_ranges(10, 5)
3-element Vector{UnitRange{Int64}}:
1:5
5:9
 9:10

PiecewiseInference.inference — Method

inference(infprob; group_size, group_nb, kwargs...)
Piecewise inference. Loops through the optimizers in optimizers. Returns an InferenceResult.
Arguments
- infprob: the InferenceProblem
- opt: array of optimizers
- group_size: size of segments
- group_nb: alternatively to group_size, one can ask for a certain number of segments
- data: the data
- tsteps: time steps corresponding to data
Optional
- u0_init: if not provided, we initialise from data
- optimizers: array of optimizers, e.g. [Adam(0.01)]
- epochs: array with the number of epochs, whose length should match that of optimizers
- batchsizes: array of batch sizes, whose length should match that of optimizers
- verbose_loss: whether to display the loss
- info_per_its = 50: frequency (in iterations) at which loss information is displayed / plotted
- plotting: whether to plot the convergence of the loss
- cb: callback function. Must be of the form cb(θs, p_trained, losses, pred, ranges)
- threshold: defaults to 1e-6
- save_pred = true: saves predictions
- save_losses = true: saves losses
- adtype = Optimization.AutoForwardDiff(): AD type to be used. Can be Optimization.AutoForwardDiff() for forward AD, or Optimization.AutoZygote() for backward AD.
- multi_threading = true: if true, segments in the piecewise loss are computed in parallel. Currently not supported with adtype = Optimization.AutoZygote().
Examples
using SciMLSensitivity # provides diffential equation sensitivity methods
using UnPack # provides the utility macro @unpack
using OptimizationOptimisers, OptimizationFlux # provide the optimizers
using LinearAlgebra
using ParametricModels
using PiecewiseInference
using OrdinaryDiffEq
using Distributions, Bijectors # used to constrain parameters and initial conditions
@model MyModel
function (m::MyModel)(du, u, p, t)
@unpack b = p
du .= 0.1 .* u .* ( 1. .- b .* u)
end
tsteps = 1.:0.5:100.5
tspan = (tsteps[1], tsteps[end])
p_true = (b = [0.23, 0.5],)
p_init = (b = [1., 2.],)
# Defining the model
# Pay attention to the semicolon before the keyword arguments of `ModelParams`
u0 = ones(2)
mp = ModelParams(;p = p_true,
tspan,
u0,
alg = BS3(),
sensealg = ForwardDiffSensitivity(),
saveat = tsteps,
)
model = MyModel(mp)
sol_data = ParametricModels.simulate(model)
ode_data = Array(sol_data)
# adding some normally distributed noise
σ_noise = 0.1
ode_data_wnoise = ode_data .+ randn(size(ode_data)) .* σ_noise
# Define the `InferenceProblem`
# First, specify which values the parameters can take with bijectors
# here, `b` is constrained to be ∈ [1e-3, 5e0] and `u0` ∈ [1e-3, 5.]
p_bij = (bijector(Uniform(1e-3, 5e0)),)
u0_bij = bijector(Uniform(1e-3,5.))
distrib_noise = MvNormal(zeros(2), σ_noise^2 * I) # zero-mean noise with variance σ_noise^2 (`I` from LinearAlgebra)
# defining `loss_likelihood`
loss_likelihood(data, pred, rng) = sum(logpdf(distrib_noise, data .- pred))
infprob = InferenceProblem(model, p_init; p_bij, u0_bij, loss_likelihood)
optimizers = [ADAM(0.001)]
epochs = [5000]
group_nb = 2
batchsizes = [1] # batch size used for each optimizer in optimizers (here only one)
# you could also have `batchsizes = [group_nb]`
res = inference(infprob,
group_nb = group_nb,
data = ode_data_wnoise,
tsteps = tsteps,
epochs = epochs,
optimizers = optimizers,
batchsizes = batchsizes,
)
p_trained = get_p_trained(res)
pred = res.pred

PiecewiseInference.iterative_inference — Method

iterative_inference(
infprob;
group_sizes,
group_nbs,
optimizers_array,
threshold,
kwargs...
)
Performs an iterative piecewise MLE, iterating over group_sizes. Stops the iteration when the loss function increases between two iterations.
Returns an array with all the InferenceResults obtained during the iteration. For kwargs, see inference.
Note
- For now, it does not support independent time series (piecewise_ML_indep_TS).
- At every iteration, initial conditions are initialised from the predictions of the previous iteration.
Specific arguments
- group_sizes: array of group sizes to test
- optimizers_array: optimizers_array[i] is an array of optimizers for the training process with group_sizes[i]
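A hedged sketch of a call, reusing infprob, ode_data_wnoise and tsteps from the inference example above (group sizes are illustrative; Adam is provided by OptimizationOptimisers):

group_sizes = [11, 26, 51]                              # segment sizes to try
optimizers_array = [[Adam(0.01)] for _ in group_sizes]  # one optimizer array per group size

res_array = iterative_inference(infprob;
                                group_sizes = group_sizes,
                                optimizers_array = optimizers_array,
                                data = ode_data_wnoise,
                                tsteps = tsteps)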
PiecewiseInference.loss_param_prior_from_dict — Method

loss_param_prior_from_dict(params, param_distrib)
Returns the loglikelihood of params given the prior distribution of the parameters, param_distrib.

- params: the parameters, in the form of a NamedTuple
- param_distrib: in the form of a Dictionary or a NamedTuple, with entries p::String => "d::Distribution"
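A hedged sketch, assuming a vector-valued parameter b with a hypothetical multivariate normal prior:

using Distributions, LinearAlgebra

param_distrib = Dict("b" => MvNormal(zeros(2), I)) # one prior distribution per parameter name
p = (b = [0.23, 0.5],)

loss_param_prior_from_dict(p, param_distrib) # loglikelihood of `p` under the prior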
PiecewiseInference.moments! — Method

moments!(xx, x)

Returns the moments of x as a vector, stored in xx.

Arguments
- xx: the covariate vector
- x: the state variable vector
PiecewiseInference.moments — Method

moments(x)

Returns the moments of x as a vector.

Arguments
- x: the state variable vector
PiecewiseInference.piecewise_ML_indep_TS — Method

piecewise_ML_indep_TS(
infprob;
data,
group_size,
group_nb,
tsteps,
save_pred,
save_losses,
kwargs...
)
Similar to inference but for independent time series, where data is a vector containing the independent arrays corresponding to the time series, and tsteps is a vector where each entry contains the time steps of the corresponding time series.
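A hedged sketch with two independent time series (reusing infprob from the inference example above; data and time steps are illustrative):

data_ts1, data_ts2 = rand(2, 100), rand(2, 80)     # two independent time series
tsteps_ts1 = collect(range(0., 50., length = 100))
tsteps_ts2 = collect(range(0., 40., length = 80))

res = piecewise_ML_indep_TS(infprob;
                            data = [data_ts1, data_ts2],
                            tsteps = [tsteps_ts1, tsteps_ts2],
                            group_size = 11)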
PiecewiseInference.piecewise_loss — Method

piecewise_loss(
infprob,
θ,
ode_data,
tsteps,
ranges,
idx_rngs
)
piecewise_loss(
infprob,
θ,
ode_data,
tsteps,
ranges,
idx_rngs,
multi_threading
)
Returns a tuple (loss, pred) based on the segmentation of ode_data into segments, with time steps given by tsteps[ranges[i]]. The initial conditions are assumed to be free parameters for each segment.
Arguments:
- infprob: the inference problem
- θ: [u0, p] where p corresponds to the parameters of the ODE function in the optimization space
- ode_data: original data to be modelled
- tsteps: time steps on which ode_data was calculated
- ranges: vector containing the range of each segment
- idx_rngs: vector containing the indices of the segments to be included in the loss
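A hedged sketch of evaluating the loss on the first two segments only; θ, the flattened [u0, p] vector in the optimization space, is normally constructed internally by inference:

ranges = get_ranges(group_size = 11, datasize = size(ode_data, 2))
idx_rngs = [1, 2] # only the first two segments contribute to the loss

l, pred = piecewise_loss(infprob, θ, ode_data, tsteps, ranges, idx_rngs)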
PiecewiseInference.pretty_print — Function

pretty_print(p_1)
pretty_print(p_1, p_2)

Prints the NamedTuple or Dict p_1 in a nice format. If p_2 is provided, also displays its values for comparison.