Overview

Neural inferential methods have marked practical appeal, as their implementation is largely decoupled from the statistical or physical model under consideration. The workflow when using the package NeuralEstimators is as follows:

  1. Sample parameters from the prior, $\pi(\boldsymbol{\theta})$, to form training/validation/test parameter sets. Alternatively, define a function to sample parameters dynamically during training. Parameters are stored as $d \times K$ matrices, with $d$ the dimensionality of the parameter vector and $K$ the number of parameter vectors in the given parameter set.

  2. Simulate data from the model conditional on the above parameter sets, to form training/validation/test data sets. Alternatively, define a function to simulate data dynamically during training. Simulated data sets are stored as mini-batches in a format amenable to the chosen neural-network architecture (see Step 3).

  3. Design and initialise a suitable neural network that maps data to $\mathbb{R}^{d^*}$ (i.e., learned summary statistics of user-specified dimension $d^*$). The architecture class (e.g., MLP, CNN, GNN, DeepSet) should reflect the structure of the data (e.g., unstructured, gridded, graph-structured, or replicated), and any Flux model can be used. Given $K$ data sets stored appropriately (see Step 2), the neural network should output a $d^* \times K$ matrix.

  4. Construct a NeuralEstimator by wrapping the neural network in the subtype corresponding to the intended inferential method (e.g., a PointEstimator for neural Bayes estimation).

  5. Train the NeuralEstimator using train() and the training set, monitoring performance and convergence using the validation set. For generic neural Bayes estimators, specify a loss function.

  6. Assess the NeuralEstimator using assess() and the test set.
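The steps above can be sketched end to end. The following is a minimal, illustrative example assuming a simple model of independent replicates from a $N(\mu, \sigma^2)$ distribution; the prior, the network widths, and the number of replicates `m = 50` are arbitrary choices made for illustration, not prescriptions:

```julia
using NeuralEstimators, Flux

# Step 1: sampler from an (illustrative) prior, μ ~ N(0, 1), σ ~ U(0, 1).
# Returns a d × K matrix with d = 2.
function sample(K)
    μ = randn(Float32, K)
    σ = rand(Float32, K)
    return vcat(μ', σ')
end

# Step 2: simulator of m iid replicates of N(μ, σ²) for each parameter vector.
# Returns a vector of K data sets, each a 1 × m matrix.
simulate(θ, m) = [ϑ[1] .+ ϑ[2] .* randn(Float32, 1, m) for ϑ in eachcol(θ)]

# Step 3: a DeepSet architecture, appropriate for replicated data,
# mapping each data set to R^d* (here d* = d = 2)
d = 2    # output dimension
w = 128  # layer width (illustrative)
ψ = Chain(Dense(1, w, relu), Dense(w, w, relu))  # inner (per-replicate) network
ϕ = Chain(Dense(w, w, relu), Dense(w, d))        # outer (aggregation) network
network = DeepSet(ψ, ϕ)

# Step 4: wrap the network in the subtype for the intended inferential method
θ̂ = PointEstimator(network)

# Step 5: train, here simulating parameters and data on the fly
θ̂ = train(θ̂, sample, simulate, m = 50)

# Step 6: assess with a held-out test set
θ_test = sample(1000)
Z_test = simulate(θ_test, 50)
assessment = assess(θ̂, θ_test, Z_test)
```

Here training simulates parameters and data dynamically (the "alternatively" route of Steps 1–2); pre-simulated training and validation sets can instead be passed to train() directly.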

Once the NeuralEstimator has passed these assessments and is deemed well calibrated, it may be used to make inferences from observed data.

Next, see the Examples and, once familiar with the basic workflow, see Advanced usage for further practical considerations on how to construct neural estimators most effectively.