Examples

End-to-end Jupyter notebooks that exercise the gmat-sweep API against a local GMAT install. Each notebook is committed with cleared cell outputs and re-executed in CI on every push, so the rendered docs always reflect the current code.

You can run them locally after pip install "gmat-sweep[examples]" (quoted so shells that glob brackets don't eat the extra; it pulls in matplotlib, distributed, and ray, so the cluster-backend notebooks run on a laptop too).

  • Single-axis SMA scan — a fifty-run scan of Sat.SMA across np.linspace(7000, 8000, 50), dispatched in parallel through the default LocalJoblibPool and overlaid on a single altitude-vs-time plot.
  • Two-axis epoch × time-of-flight grid — a Cartesian product over Sat.Epoch and a script-level Variable TOF, reshaped into a 2D matrix and contoured by per-run miss distance.
  • Surviving a kill — launch a sweep as a subprocess, send it SIGINT mid-run, inspect the partial manifest with gmat-sweep show, reload the partial DataFrame from disk, then complete the sweep with a programmatic Sweep.from_manifest(...).resume() call.
  • Monte Carlo dispersion — 1000-run Monte Carlo around a nominal injection burn over a four-axis perturbation cube (parking-orbit coast time and the three VNB delta-V components). Histogram of arrival miss distances, 3-sigma covariance ellipse in the (X, Y) plane, and a recipe demonstrating the determinism contract via expand_monte_carlo_to_run_specs.
  • Latin hypercube vs Monte Carlo — 64-run Latin hypercube alongside a 64-run plain Monte Carlo against the same four-axis injection perturbation. Pair plot of the unit-cube samples to make the stratification visible, and a side-by-side miss-distance histogram for the variance-reduction case.
  • Dask cluster recipe — 100-run Sat.SMA grid sweep dispatched through a distributed.LocalCluster with DaskPool. Same client API, same dashboard, same submit/await flow as a real dask.distributed cluster.
  • Ray autoscaling recipe — 100-run Monte Carlo against the notebook 04 fixture, dispatched through RayPool against a local ray.init(). Same task model as a real autoscaling Ray cluster.