
HybridDynamicModels.jl

Lux.jl layers and utilities to build and train hybrid dynamic models.


HybridDynamicModels.jl is a toolbox for building and training hybrid dynamic models, which combine mechanistic and data-driven components. Built on top of the deep learning framework Lux.jl, it supports both gradient-descent optimization and Bayesian inference.

🚀 Key Features

Dynamic model layers

  • ICLayer: For initial condition inference
  • ODEModel: Neural ODEs (see the sketch after this list)
  • ARModel: Autoregressive models
  • AnalyticModel: For explicit dynamical models
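
For instance, a neural ODE can be assembled much like the ARModel in the Quick Start. A minimal sketch, assuming ODEModel takes the same (layers, dynamics_function) positional arguments as ARModel; solver-related keyword arguments are omitted here, so check the API docs for the exact options:

using HybridDynamicModels, Lux

# Neural network standing in for the unknown part of the vector field
nn = Dense(2, 2, tanh)

# du/dt = f(u, t); same (layers, u, ps, t) signature as ar_step in the Quick Start
function dudt(layers, u, ps, t)
    return layers.nn(u, ps.nn)
end

node = ODEModel((; nn = nn), dudt)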

Utility layers for hybrid modeling

  • ParameterLayer: Learnable parameters, composable with optional Constraint layers (see the sketch after this list)
  • BayesianLayer: Add probabilistic priors to any Lux layer
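
As a sketch of this composition, assuming ParameterLayer accepts a constraint keyword (the keyword name is an assumption; see the API docs), a rate parameter can be kept in bounds with the same BoxConstraint and NamedTupleConstraint used in the Quick Start:

# Hypothetical sketch: the `constraint` keyword name is an assumption
bounded_rates = ParameterLayer(
    init_value = (growth = [0.1],),
    constraint = NamedTupleConstraint((; growth = BoxConstraint([0.0], [1.0]))),
)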

Data loaders

  • SegmentedTimeSeries: A time series data loader that splits a series into (possibly overlapping) segments for mini-batched training; see the sketch below.
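
A minimal sketch of the loader in isolation; the constructor call matches the Quick Start below, while the iteration protocol (each element as a (segment_data, segment_tsteps) pair) is an assumption:

using HybridDynamicModels

data = rand(2, 100)                       # 2 state variables, 100 time points
tsteps = range(0.0, step = 0.1, length = 100)

# Overlapping segments: 10 points each, successive windows shifted by 2 points
dataloader = SegmentedTimeSeries((data, tsteps); segment_length = 10, shift = 2)

# Assumed iteration protocol
for (segment_data, segment_tsteps) in dataloader
    @assert size(segment_data, 2) == length(segment_tsteps)
end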

Training API

  • SGDBackend: gradient-based training via Optimisers.jl (the "Lux backend" below)
  • MCSamplingBackend: Bayesian inference via MCMC sampling with Turing.jl (the "Turing backend" below)

📦 Installation

using Pkg
Pkg.add("HybridDynamicModels")

🔥 Quick Start

Autoregressive hybrid model

using HybridDynamicModels
using Lux
using Random

# Dense layer for interactions
interaction_layer = Dense(2, 2, tanh)

# Parameter layer for growth/decay rates
rate_params = ParameterLayer(init_value = (growth = [0.1], decay = [0.05]))

# Simple hybrid dynamics: linear terms + neural interactions
function ar_step(layers, u, ps, t)
    # Linear terms from parameters
    params = layers.rates(ps.rates)
    growth = vcat(params.growth, -params.decay)
    
    # Neural network interactions
    interactions = layers.interaction(u, ps.interaction)

    return u .* (growth + interactions)
end

# Create autoregressive model
model = ARModel(
    (interaction = interaction_layer, rates = rate_params),
    ar_step;
    dt = 0.1)

# Setup and train
ps, st = Lux.setup(Random.default_rng(), model)
tsteps = range(0, stop=10.0, step=0.1)

preds, _ = model((; u0 = [1.0, 1.0], 
                tspan = (tsteps[1], tsteps[end]), 
                saveat = tsteps), ps, st)
size(preds)  # (2, 101)

Lux backend

using Optimisers
using Zygote  # AD engine used by AutoZygote()

data = rand(2, length(tsteps))
dataloader = SegmentedTimeSeries((data, tsteps); segment_length = 10, shift = 2)

backend = SGDBackend(Adam(1e-2), 100, AutoZygote(), MSELoss())
result = train(backend, model, dataloader, InferICs(false))

# Make predictions
tspan = (tsteps[1], tsteps[end])
prediction, _ = model((; u0 = result.ics[1].u0, 
                        tspan = tspan, 
                        saveat = tsteps), result.ps, result.st)

Turing backend

using Distributions, Turing

# Add priors to rate parameters
rate_priors = (
    growth = arraydist([Normal(0.1, 0.05)]),
    decay = arraydist([Normal(0.05, 0.02)])
)
nn_priors = Normal(0, 1)  # Example prior for NN weights

# Create Bayesian model
bayesian_model = ARModel(
    (interaction = BayesianLayer(interaction_layer, nn_priors),
     rates = BayesianLayer(rate_params, rate_priors)),
    ar_step;
    dt = 0.1,
)

# MCMC training
datadistrib = Normal
mcmc_backend = MCSamplingBackend(NUTS(0.65), 500, datadistrib)
result = train(mcmc_backend, 
                bayesian_model, 
                dataloader, 
                InferICs(false))

# Sample from posterior
chains = result.chains
posterior_samples = sample(bayesian_model, chains, 50)

Learning Initial Conditions

# Learn initial conditions for each data segment
constraint_u0 = NamedTupleConstraint((; u0 = BoxConstraint([0.1, 0.1], [2.0, 2.0])))  # elementwise bounds on u0
infer_ics = InferICs(true, constraint_u0)

# Create model for initial condition learning
ic_model = ARModel(
    (interaction = interaction_layer, rates = rate_params),
    ar_step;
    dt = 0.1,
)

# Train with learned initial conditions
result = train(backend, ic_model, dataloader, infer_ics)

# Access learned initial conditions
for (i, ic) in enumerate(result.ics)
    println("Segment $i initial condition: ", ic.u0)
end

📚 Documentation

Examples

  • data_loading.jl: Demonstrates how to use the SegmentedTimeSeries data loader for batching and segmentation of time series data.

  • sgd_example.jl: Complete example showcasing gradient-based training with the SGD backend using real Lynx-Hare population data.

  • mcsampling_example.jl: Bayesian parameter estimation example using MCMC sampling with the MCSamplingBackend.

API

See the documentation.

๐Ÿ™ Acknowledgments

Built on the excellent LuxDL, SciML, and TuringLang ecosystems.