Time scales¶
Leap-second-correct conversion between the five GMAT time scales (A1, TAI,
UTC, TT, TDB). All conversions go through
astropy.time.Time, which owns
the IERS leap-second table; gmat-run does not bundle leap-second data of its
own.
A1 is GMAT-specific — astropy does not recognise it. Per the GMAT Mathematical Specification, A1 leads TAI by a fixed 0.0343817 s; this module routes A1 through TAI by applying that offset before/after the astropy conversion.
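Because the A1−TAI offset is a fixed constant, the A1 leg reduces to a plain Timedelta shift. A minimal sketch of the idea in pandas alone (the convert function below wraps this, plus the astropy leg, for you; the sample epochs are made up):

```python
import pandas as pd

# Fixed A1 - TAI offset from the GMAT Mathematical Specification.
A1_MINUS_TAI = pd.Timedelta(seconds=0.0343817)

tai = pd.Series(pd.to_datetime(["2016-12-31 23:59:00", "2017-01-01 00:00:30"]))

# A1 leads TAI, so add the offset going TAI -> A1 and subtract going back.
a1 = tai + A1_MINUS_TAI
round_trip = a1 - A1_MINUS_TAI

print((round_trip == tai).all())  # True: the shift is exactly reversible at ns precision
```

The offset is an exact number of nanoseconds, so the round trip loses nothing at datetime64[ns] resolution.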
This module is gated behind the [astropy] extra. Importing the module
without astropy installed is fine; calling the conversion functions raises a
clear ImportError pointing at the extra.
Quick reference¶
import pandas as pd
from gmat_run import Mission
from gmat_run.time import convert, convert_column
mission = Mission.load("flyby.script")
result = mission.run()
df = result.reports["ReportFile1"] # df.attrs["epoch_scales"] is set by promote_epochs
# Series-level: convert one column from its native scale to UTC.
df["Sat.TAIGregorian"] = convert(df["Sat.TAIGregorian"], "TAI", "UTC")
# DataFrame-level: convert and update df.attrs["epoch_scales"] in one call.
convert_column(df, "Sat.TAIGregorian", "UTC")
Parser-level convert_to¶
For the common case of "I want every epoch column on a single scale", the
parsers and promote_epochs take
a convert_to= keyword that runs the conversion in one call:
from gmat_run.parsers.reportfile import parse
# Mixed-scale ReportFile (TAIGregorian + UTCModJulian) → all UTC.
df = parse("flyby.report", convert_to="UTC")
assert all(scale == "UTC" for scale in df.attrs["epoch_scales"].values())
The same keyword works on every parser whose output carries an
epoch_scales attr:
- gmat_run.parsers.reportfile.parse
- gmat_run.parsers.ephemeris.parse (CCSDS-OEM)
- gmat_run.parsers.stk_ephemeris.parse
- gmat_run.parsers.aem_ephemeris.parse
CCSDS-OEM and CCSDS-AEM permit TIME_SYSTEM values (UT1, GPS, TCG, …)
that fall outside the five GMAT scales. Calling these parsers with
convert_to= on such a file raises ValueError rather than silently
mis-converting; reach for the underlying convert
once you have a mapping you trust.
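As one illustration of "a mapping you trust": GPS time runs a fixed 19 s behind TAI, so a GPS-stamped epoch column can be shifted onto TAI by hand and then handed to the five-scale machinery. The epoch values below are invented for the sketch; the 19 s constant is the standard TAI−GPS offset, not something this module provides:

```python
import pandas as pd

# Hypothetical GPS-stamped epoch column, parsed without convert_to=
# so the unsupported TIME_SYSTEM is left untouched.
epochs = pd.Series(pd.to_datetime(["2020-01-01 00:00:00", "2020-01-01 00:10:00"]))

# TAI runs a fixed 19 s ahead of GPS time, so shifting by +19 s lands on TAI.
tai_epochs = epochs + pd.Timedelta(seconds=19)

# From here the supported machinery applies, e.g.:
#     from gmat_run.time import convert
#     utc_epochs = convert(tai_epochs, "TAI", "UTC")
print(tai_epochs.iloc[0])  # 2020-01-01 00:00:19
```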
convert¶
Convert an epoch Series from from_scale to to_scale.
Both scales must be one of "A1", "TAI", "UTC", "TT",
"TDB". Same-scale conversion returns a copy of series without
importing astropy.
Leap-second instants on the to_scale="UTC" path collapse to the
post-jump second — matching what GMAT does internally — at microsecond
precision. numpy.datetime64 cannot represent 23:59:60 so this
is the only sensible representation; non-leap-second rows keep full
datetime64[ns] precision even when a sibling row in the same
series lands on a leap second.
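The datetime64 limitation is easy to see in plain pandas: the leap-second label itself is unrepresentable, which is why the UTC path collapses it to the post-jump second:

```python
import pandas as pd

# numpy.datetime64 has no slot for second 60, so the raw leap-second
# label cannot even be parsed:
try:
    pd.Timestamp("2016-12-31 23:59:60")
except ValueError as exc:
    print("unrepresentable:", exc)

# The instant therefore collapses to the post-jump second:
print(pd.Timestamp("2017-01-01 00:00:00"))
```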
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| series | Series[Timestamp] | Epoch series to convert. | required |
| from_scale | str | Source GMAT time scale. | required |
| to_scale | str | Target GMAT time scale. | required |
Returns:

| Type | Description |
|---|---|
| Series[Timestamp] | A new Series[Timestamp] representing the same physical instant in to_scale. |
Raises:

| Type | Description |
|---|---|
| ValueError | If from_scale or to_scale is not one of the five GMAT scales. |
| ImportError | If astropy is not installed and a cross-scale conversion is requested. |
convert_column¶
Convert df[column] to to_scale and update df.attrs.
The source scale is read from df.attrs["epoch_scales"][column] —
populated by :func:gmat_run.parsers.epoch.promote_epochs. After
conversion df.attrs["epoch_scales"][column] is updated to
to_scale.
Idempotent when the source and target scales are equal: no astropy import, no data copy beyond the in-place attrs touch.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| df | DataFrame | DataFrame whose attrs["epoch_scales"] records the source scale of column. | required |
| column | str | Column name. Must be present in df.attrs["epoch_scales"]. | required |
| to_scale | str | Target GMAT time scale. | required |
Returns:

| Type | Description |
|---|---|
| DataFrame | df with df[column] converted and attrs["epoch_scales"][column] set to to_scale. |
Raises:

| Type | Description |
|---|---|
| ValueError | If column is missing from df.attrs["epoch_scales"], or to_scale is not one of the five GMAT scales. |
| ImportError | If astropy is not installed and a cross-scale conversion is requested. |
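The attrs bookkeeping described above can be sketched in plain pandas. This is a simplified model of convert_column's control flow, not the actual implementation; real cross-scale conversion goes through astropy, and the function name here is made up:

```python
import pandas as pd

def convert_column_sketch(df, column, to_scale):
    # Simplified model of convert_column's bookkeeping: read the source
    # scale from attrs, short-circuit when it already matches, and record
    # the new scale after converting.
    from_scale = df.attrs["epoch_scales"][column]
    if from_scale == to_scale:
        return df  # idempotent: no astropy import, no data copy
    # ... the real cross-scale conversion of df[column] happens here ...
    df.attrs["epoch_scales"][column] = to_scale
    return df

df = pd.DataFrame({"Epoch": pd.to_datetime(["2020-01-01"])})
df.attrs["epoch_scales"] = {"Epoch": "TAI"}
convert_column_sketch(df, "Epoch", "TAI")   # same-scale: no-op
print(df.attrs["epoch_scales"]["Epoch"])    # TAI
```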