
Merge pull request #111 from daniel-habermann/Development
fix some typos and out-of-sync docstrings
stefanradev93 authored Nov 21, 2023
2 parents dcc1dfd + 73051ed commit 7da2016
Showing 10 changed files with 42 additions and 42 deletions.
4 changes: 2 additions & 2 deletions bayesflow/computational_utilities.py
@@ -51,7 +51,7 @@ def posterior_calibration_error(
The random draws from the approximate posteriors over ``num_datasets``
prior_samples : np.ndarray of shape (num_datasets, num_params)
The corresponding ground-truth values sampled from the prior
- alpha_resolution : int, optional, default: 100
+ alpha_resolution : int, optional, default: 20
The number of credibility intervals (CIs) to consider
aggregator_fun : callable or None, optional, default: np.median
The function used to aggregate the marginal calibration errors.
@@ -233,7 +233,7 @@ def mmd_kernel_unbiased(x, y, kernel):
Returns
-------
loss : tf.Tensor of shape (,)
- The statistically unbiaserd squared maximum mean discrepancy (MMD) value.
+ The statistically unbiased squared maximum mean discrepancy (MMD) value.
"""

m, n = x.shape[0], y.shape[0]
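For readers skimming this hunk: `mmd_kernel_unbiased` computes the standard U-statistic estimator of the squared MMD (Gretton et al., 2012). A minimal NumPy sketch of that estimator with a Gaussian kernel — an illustration of the formula, not BayesFlow's TensorFlow implementation — could look like:

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    # Pairwise Gaussian kernel values between the rows of a and b
    sq_dists = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def mmd_squared_unbiased(x, y, kernel=gaussian_kernel):
    # U-statistic: diagonal (i == j) terms are excluded, which removes
    # the bias of the plug-in estimator but allows slightly negative values.
    m, n = x.shape[0], y.shape[0]
    k_xx, k_yy, k_xy = kernel(x, x), kernel(y, y), kernel(x, y)
    term_x = (k_xx.sum() - np.trace(k_xx)) / (m * (m - 1))
    term_y = (k_yy.sum() - np.trace(k_yy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * k_xy.mean()
```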
2 changes: 1 addition & 1 deletion bayesflow/configuration.py
@@ -26,7 +26,7 @@


class DefaultJointConfigurator:
"""Fallback class for a generic configrator for joint posterior and likelihood approximation."""
"""Fallback class for a generic configurator for joint posterior and likelihood approximation."""

def __init__(self, default_float_type=np.float32):
self.posterior_config = DefaultPosteriorConfigurator(default_float_type=default_float_type)
10 changes: 5 additions & 5 deletions bayesflow/coupling_networks.py
@@ -313,7 +313,7 @@ def _calculate_spline(self, target, spline_params, inverse=False):
target : tf.Tensor of shape (batch_size, ..., dim_2)
The target partition of the input vector to transform.
spline_params : tuple(tf.Tensor,...)
- A tuple with tensors corresponding to the learnbale spline features:
+ A tuple with tensors corresponding to the learnable spline features:
(left_edge, bottom_edge, widths, heights, derivatives)
inverse : bool, optional, default: False
Flag indicating whether to run the block forward or backward.
@@ -516,7 +516,7 @@ def __init__(
for the required entries.
coupling_design : str or callable, optional, default: 'affine'
The type of internal coupling network to use. Must be in ['affine', 'spline'].
- In general, spline couplings run slower than affine couplings, but require fewers coupling
+ In general, spline couplings run slower than affine couplings, but requires fewer coupling
layers. Spline couplings may work best with complex (e.g., multimodal) low-dimensional
problems. The difference will become less and less pronounced as we move to higher dimensions.
permutation : str or None, optional, default: 'fixed'
@@ -581,15 +581,15 @@ def __init__(
self.act_norm = None

def call(self, target_or_z, condition, inverse=False, **kwargs):
"""Performs one pass through a the affine coupling layer (either inverse or forward).
"""Performs one pass through the affine coupling layer (either inverse or forward).
Parameters
----------
target_or_z : tf.Tensor
- The estimation quantites of interest or latent representations z ~ p(z), shape (batch_size, ...)
+ The estimation quantities of interest or latent representations z ~ p(z), shape (batch_size, ...)
condition : tf.Tensor or None
The conditioning data of interest, for instance, x = summary_fun(x), shape (batch_size, ...).
- If `condition is None`, then the layer recuces to an unconditional ACL.
+ If `condition is None`, then the layer reduces to an unconditional ACL.
inverse : bool, optional, default: False
Flag indicating whether to run the block forward or backward.
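For context on the `call(target_or_z, condition, inverse=False)` contract documented above: an affine coupling layer splits its input, lets one part parameterize a scale and shift for the other, and is cheap to invert in closed form. A simplified sketch of the core transform (the conditioner `net` and its handling of `condition` are hypothetical placeholders, not BayesFlow's internals):

```python
import tensorflow as tf

def affine_coupling_step(u1, u2, net, condition=None, inverse=False):
    # `net` maps (u1, condition) to a (log_scale, shift) pair with the
    # same shape as u2; with condition=None this is the unconditional case.
    log_scale, shift = net(u1, condition)
    if not inverse:
        v2 = u2 * tf.exp(log_scale) + shift
        log_det_J = tf.reduce_sum(log_scale, axis=-1)  # diagonal Jacobian
        return v2, log_det_J
    return (u2 - shift) * tf.exp(-log_scale)
```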
40 changes: 20 additions & 20 deletions bayesflow/diagnostics.py
@@ -84,9 +84,9 @@ def plot_recovery(
The parameter names for nice plot titles. Inferred if None
fig_size : tuple or None, optional, default : None
The figure size passed to the matplotlib constructor. Inferred if None.
- label_fontsize : int, optional, default: 14
+ label_fontsize : int, optional, default: 16
The font size of the y-label text
- title_fontsize : int, optional, default: 16
+ title_fontsize : int, optional, default: 18
The font size of the title text
metric_fontsize : int, optional, default: 16
The font size of the goodness-of-fit metric (if provided)
@@ -114,7 +114,7 @@ def plot_recovery(
Raises
------
ShapeError
- If there is a deviation form the expected shapes of ``post_samples`` and ``prior_samples``.
+ If there is a deviation from the expected shapes of ``post_samples`` and ``prior_samples``.
"""

# Sanity check
@@ -252,12 +252,12 @@ def plot_z_score_contraction(
post_contraction = 1 - (posterior_variance / prior_variance)
- In other words, the posterior is a proxy for the reduction in uncertainty gained by
+ In other words, the posterior contraction is a proxy for the reduction in uncertainty gained by
replacing the prior with the posterior. The ideal posterior contraction tends to 1.
Contraction near zero indicates that the posterior variance is almost identical to
the prior variance for the particular marginal parameter distribution.
- Note: Means and variances will be estimated vie their sample-based estimators.
+ Note: Means and variances will be estimated via their sample-based estimators.
[1] Schad, D. J., Betancourt, M., & Vasishth, S. (2021).
Toward a principled Bayesian workflow in cognitive science.
@@ -275,9 +275,9 @@
The parameter names for nice plot titles. Inferred if None
fig_size : tuple or None, optional, default : None
The figure size passed to the matplotlib constructor. Inferred if None.
- label_fontsize : int, optional, default: 14
+ label_fontsize : int, optional, default: 16
The font size of the y-label text
- title_fontsize : int, optional, default: 16
+ title_fontsize : int, optional, default: 18
The font size of the title text
tick_fontsize : int, optional, default: 12
The font size of the axis ticklabels
@@ -295,7 +295,7 @@
Raises
------
ShapeError
- If there is a deviation form the expected shapes of ``post_samples`` and ``prior_samples``.
+ If there is a deviation from the expected shapes of ``post_samples`` and ``prior_samples``.
"""

# Sanity check for shape integrity
@@ -421,7 +421,7 @@ def plot_sbc_ecdf(
The font size of the y-label and y-label texts
legend_fontsize : int, optional, default: 14
The font size of the legend text
- title_fontsize : int, optional, default: 16
+ title_fontsize : int, optional, default: 18
The font size of the title text. Only relevant if `stacked=False`
tick_fontsize : int, optional, default: 12
The font size of the axis ticklabels
@@ -587,11 +587,11 @@ def plot_sbc_histograms(
The figure size passed to the matplotlib constructor. Inferred if None
num_bins : int, optional, default: 10
The number of bins to use for each marginal histogram
- binomial_interval : float in (0, 1), optional, default: 0.95
+ binomial_interval : float in (0, 1), optional, default: 0.99
The width of the confidence interval for the binomial distribution
- label_fontsize : int, optional, default: 14
+ label_fontsize : int, optional, default: 16
The font size of the y-label text
- title_fontsize : int, optional, default: 16
+ title_fontsize : int, optional, default: 18
The font size of the title text
tick_fontsize : int, optional, default: 12
The font size of the axis ticklabels
@@ -1077,7 +1077,7 @@ def plot_calibration_curves(
The font size of the y-label and y-label texts
legend_fontsize : int, optional, default: 14
The font size of the legend text (ECE value)
- title_fontsize : int, optional, default: 16
+ title_fontsize : int, optional, default: 18
The font size of the title text. Only relevant if `stacked=False`
tick_fontsize : int, optional, default: 12
The font size of the axis ticklabels
@@ -1296,20 +1296,20 @@ def plot_mmd_hypothesis_test(
The samples from the MMD sampling distribution under the null hypothesis "the model is well-specified"
mmd_observed : float
The observed MMD value
- alpha_level : float
+ alpha_level : float, optional, default: 0.05
The rejection probability (type I error)
- null_color : str or tuple
+ null_color : str or tuple, optional, default: (0.16407, 0.020171, 0.577478)
The color of the H0 sampling distribution
- observed_color : str or tuple
+ observed_color : str or tuple, optional, default: "red"
The color of the observed MMD
- alpha_color : str or tuple
+ alpha_color : str or tuple, optional, default: "orange"
The color of the rejection area
- truncate_vlines_at_kde: bool
+ truncate_vlines_at_kde: bool, optional, default: False
true: cut off the vlines at the kde
false: continue kde lines across the plot
- xmin : float
+ xmin : float, optional, default: None
The lower x-axis limit
- xmax : float
+ xmax : float, optional, default: None
The upper x-axis limit
bw_factor : float, optional, default: 1.5
bandwidth (aka. smoothing parameter) of the kernel density estimate
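All diagnostics in this file share the `post_samples` / `prior_samples` convention from the docstrings above. A hedged smoke-test sketch with random placeholder arrays (shapes as documented in this diff; the exact signature may vary across BayesFlow versions):

```python
import numpy as np
import bayesflow as bf

rng = np.random.default_rng(seed=0)
# Toy shapes: 100 data sets, 500 posterior draws each, 3 parameters
post_samples = rng.normal(size=(100, 500, 3))
prior_samples = rng.normal(size=(100, 3))

fig = bf.diagnostics.plot_recovery(
    post_samples, prior_samples, param_names=["a", "b", "c"]
)
```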
4 changes: 2 additions & 2 deletions bayesflow/helper_classes.py
@@ -162,7 +162,7 @@ class EarlyStopper:
def __init__(self, patience=5, tolerance=0.05):
"""
- patience : int, optional, default: 4
+ patience : int, optional, default: 5
How many successive times the tolerance value is reached before triggering
an early stopping recommendation.
tolerance : float, optional, default: 0.05
@@ -769,7 +769,7 @@ def __init__(self, capacity_in_batches=500):
Parameters
----------
- capacity_in_batches : int, optional, default: 50
+ capacity_in_batches : int, optional, default: 500
The capacity of the buffer in batches of simulations. Could potentially grow
very large, so make sure you pick a reasonable number!
"""
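To make the patience/tolerance semantics concrete, here is a hypothetical re-implementation of the early-stopping rule as described in the docstring (illustrative only, not the `EarlyStopper` source):

```python
class EarlyStopperSketch:
    """Recommends stopping once the loss change stays below `tolerance`
    for `patience` successive updates (sketch of the documented rule)."""

    def __init__(self, patience=5, tolerance=0.05):
        self.patience, self.tolerance = patience, tolerance
        self._last_loss, self._hits = None, 0

    def should_stop(self, current_loss):
        if self._last_loss is not None:
            improved = abs(self._last_loss - current_loss) >= self.tolerance
            self._hits = 0 if improved else self._hits + 1
        self._last_loss = current_loss
        return self._hits >= self.patience
```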
2 changes: 1 addition & 1 deletion bayesflow/helper_functions.py
@@ -165,7 +165,7 @@ def backprop_step(input_dict, amortizer, optimizer, **kwargs):
Parameters
----------
input_dict : dict
- The configured output of the genrative model
+ The configured output of the generative model
amortizer : tf.keras.Model
The custom amortizer. Needs to implement a compute_loss method.
optimizer : tf.keras.optimizers.Optimizer
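The documented contract — the amortizer exposes `compute_loss`, the optimizer applies the gradients — corresponds to a standard TF2 training step. A minimal sketch under that assumption (the real helper also threads through `**kwargs`):

```python
import tensorflow as tf

def backprop_step_sketch(input_dict, amortizer, optimizer):
    # One gradient step on the amortizer's self-reported loss
    with tf.GradientTape() as tape:
        loss = amortizer.compute_loss(input_dict)
    grads = tape.gradient(loss, amortizer.trainable_variables)
    optimizer.apply_gradients(zip(grads, amortizer.trainable_variables))
    return loss
```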
4 changes: 2 additions & 2 deletions bayesflow/helper_networks.py
@@ -179,7 +179,7 @@ def _inverse(self, target):


class Orthogonal(tf.keras.Model):
"""Imeplements a learnable orthogonal transformation according to [1]. Can be
"""Implements a learnable orthogonal transformation according to [1]. Can be
used as an alternative to a fixed ``Permutation`` layer.
[1] Kingma, D. P., & Dhariwal, P. (2018). Glow: Generative flow with invertible 1x1
@@ -357,7 +357,7 @@ def call(self, target, inverse=False):
If inverse=False: The transformed input and the corresponding Jacobian of the transformation,
v shape: (batch_size, inp_dim), log_det_J shape: (,)
target : tf.Tensor
- If inverse=True: The inversly transformed targets, shape == target.shape
+ If inverse=True: The inversely transformed targets, shape == target.shape
Notes
-----
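For background on the `Orthogonal` layer's reference [1]: Glow replaces fixed permutations with a learnable invertible linear map. A generic sketch of that idea — orthogonal initialization, generic invertible matrix, and emphatically not BayesFlow's exact parameterization:

```python
import tensorflow as tf

class InvertibleLinearSketch(tf.keras.Model):
    def __init__(self, dim):
        super().__init__()
        init = tf.keras.initializers.Orthogonal()
        self.w = tf.Variable(init(shape=(dim, dim)), trainable=True)

    def call(self, target, inverse=False):
        if not inverse:
            v = tf.matmul(target, self.w)
            _, log_det_J = tf.linalg.slogdet(self.w)  # log |det W|
            return v, log_det_J
        # Inverse pass simply undoes the linear map
        return tf.matmul(target, tf.linalg.inv(self.w))
```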
6 changes: 3 additions & 3 deletions bayesflow/inference_networks.py
@@ -119,8 +119,8 @@ def __init__(
Optional data-dependent initialization for the internal ``ActNorm`` layers, as done in [5]. Could be helpful
for deep invertible networks.
use_soft_flow : bool, optional, default: False
- Whether to perturb the taregt distribution (i.e., parameters) with small amount of independent
- noise, as done in [2]. Could be helpful for degenrate distributions.
+ Whether to perturb the target distribution (i.e., parameters) with small amount of independent
+ noise, as done in [2]. Could be helpful for degenerate distributions.
soft_flow_bounds : tuple(float, float), optional, default: (1e-3, 5e-2)
The bounds of the continuous uniform distribution from which the noise scale would be sampled
at each iteration. Only relevant when ``use_soft_flow=True``.
@@ -178,7 +178,7 @@ def call(self, targets, condition, inverse=False, **kwargs):
return self.forward(targets, condition, **kwargs)

def forward(self, targets, condition, **kwargs):
"""Performs a forward pass though the chain."""
"""Performs a forward pass through the chain."""

# Add noise to target if using SoftFlow, use explicitly
# not in call(), since methods are public
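The SoftFlow option documented above counteracts degenerate (e.g., lower-dimensional) targets by adding noise whose scale is freshly drawn from the documented bounds each iteration. Schematically — an illustration of the idea in [2], not the network's code; in SoftFlow the sampled scale is also fed to the network as an extra condition:

```python
import tensorflow as tf

def soft_flow_perturb(targets, bounds=(1e-3, 5e-2)):
    # One noise scale per batch element, redrawn on every call
    low, high = bounds
    scale = tf.random.uniform((tf.shape(targets)[0], 1), low, high)
    noisy_targets = targets + scale * tf.random.normal(tf.shape(targets))
    return noisy_targets, scale
```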
4 changes: 2 additions & 2 deletions bayesflow/sensitivity.py
@@ -53,11 +53,11 @@ def misspecification_experiment(
second_config_dict : dict
Configuration for the second misspecification factor
fields: name (str), values (1D np.ndarray)
- error_function : callable, default: bayesflow.computational_utilities.aggregated_rmse
+ error_function : callable, default: bayesflow.computational_utilities.aggregated_error
A callable that computes an error metric on the approximate posterior samples
n_posterior_samples : int, optional, default: 500
Number of samples from the approximate posterior per data set
- n_sim : int, optional, default: 100
+ n_sim : int, optional, default: 200
Number of simulated data sets per configuration
configurator : callable or None, optional, default: None
An optional configurator for the misspecified simulations.
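Conceptually, `misspecification_experiment` sweeps the Cartesian grid of the two factors and records an error metric per cell. A pseudocode-level sketch, where `run_and_score` is a hypothetical stand-in for simulate, amortize, and score:

```python
import numpy as np

def misspecification_grid(values_1, values_2, run_and_score):
    # values_1, values_2: 1D arrays of misspecification factor values;
    # run_and_score(v1, v2) simulates data under configuration (v1, v2)
    # and returns an aggregated posterior error (hypothetical helper).
    errors = np.zeros((len(values_1), len(values_2)))
    for i, v1 in enumerate(values_1):
        for j, v2 in enumerate(values_2):
            errors[i, j] = run_and_score(v1, v2)
    return errors
```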
8 changes: 4 additions & 4 deletions bayesflow/simulation.py
@@ -405,7 +405,7 @@ def __init__(
Examples
--------
- Varying number of local factors (e.g., groups, participants) between 1 and 100::
+ Varying number of local factors (e.g., groups, participants) between 1 and 100:
def draw_hyper():
# Draw location for 2D conditional prior
@@ -529,7 +529,7 @@ def __init__(self, batch_simulator_fun=None, simulator_fun=None, context_generat
vectors and context variables and will pass the latter directly to the function. Power users should attempt to provide
optimized batched simulators.
- If a ``simulator_fun`` is provided, the interface will assume thatthe function operates on single parameter vectors and
+ If a ``simulator_fun`` is provided, the interface will assume that the function operates on single parameter vectors and
context variables and will wrap the simulator internally to allow batched functionality.
Parameters
@@ -815,9 +815,9 @@ def plot_pushforward(
funcs_labels : list of str
A list of labels for the functions in funcs_list.
The default behavior without user input is to call the functions "Aggregator function 1, Aggregator function 2, etc."
- batch_size : int
+ batch_size : int, optional, default: 1000
The number of prior draws to generate (and then create and visualizes simulations from)
- show_raw_sims : bool
+ show_raw_sims : bool, optional, default: True
Flag determining whether or not a plot of 49 raw (i.e. unaggregated) simulations is generated.
Useful for very general data exploration.
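The internal wrapping mentioned in this hunk — turning a per-parameter-vector `simulator_fun` into a batched one — can be pictured as a simple loop. A minimal sketch (the real wrapper also forwards context variables):

```python
import numpy as np

def wrap_as_batched(simulator_fun):
    # Apply the single-vector simulator row-wise and stack the outputs
    def batch_simulator_fun(params_batch):
        return np.stack([simulator_fun(theta) for theta in params_batch])
    return batch_simulator_fun
```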
