Documentation ¶
Index ¶
- Constants
- func GetBillableSize(apiKey string, metaParams MetadataQueryParams) (int, error)
- func GetCost(apiKey string, metaParams MetadataQueryParams) (float64, error)
- func GetRange(apiKey string, jobParams SubmitJobParams) ([]byte, error)
- func GetRecordCount(apiKey string, metaParams MetadataQueryParams) (int, error)
- func ListDatasets(apiKey string, dateRange DateRange) ([]string, error)
- func ListSchemas(apiKey string, dataset string) ([]string, error)
- type BatchFileDesc
- type BatchJob
- type ConditionDetail
- type DateRange
- type Delivery
- type FeedMode
- type FieldDetail
- type GetRangeParams
- type JobExpiredError
- type JobState
- type MappingInterval
- type MetadataQueryParams
- type PublisherDetail
- type RequestError
- type RequestErrorResp
- type Resolution
- type ResolveParams
- type SubmitJobParams
- type UnitPricesForMode
Constants ¶
const (
	Condition_Available = "available"
	Condition_Degraded  = "degraded"
	Condition_Pending   = "pending"
	Condition_Missing   = "missing"
	Condition_Intraday  = "intraday"
)
Variables ¶
This section is empty.
Functions ¶
func GetBillableSize ¶ added in v0.0.11
func GetBillableSize(apiKey string, metaParams MetadataQueryParams) (int, error)
Calls the Metadata API to get the billable size of a GetRange query. Returns the billable size or an error if any.
func GetCost ¶ added in v0.0.11
func GetCost(apiKey string, metaParams MetadataQueryParams) (float64, error)
Calls the Metadata API to get the cost estimate of a GetRange query. Returns the cost or an error if any.
func GetRange ¶ added in v0.0.13
func GetRange(apiKey string, jobParams SubmitJobParams) ([]byte, error)
GetRange makes a streaming request for timeseries data from Databento.
It returns the raw bytes of the DBN stream.
# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
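A minimal usage sketch follows. The import paths, the `dbn_hist` alias, and the `dbn.Schema_Trades` constant name are assumptions; adjust them to your module:

```go
package main

import (
	"fmt"
	"log"
	"os"
	"time"

	dbn "github.com/NimbleMarkets/dbn-go"           // assumed import path
	dbn_hist "github.com/NimbleMarkets/dbn-go/hist" // assumed import path
)

func main() {
	apiKey := os.Getenv("DATABENTO_API_KEY")

	// Request a small window of trades as a raw DBN byte stream.
	params := dbn_hist.SubmitJobParams{
		Dataset: "GLBX.MDP3",
		Symbols: "ES.FUT",
		Schema:  dbn.Schema_Trades, // schema constant name is an assumption
		DateRange: dbn_hist.DateRange{
			Start: time.Date(2024, 1, 2, 14, 0, 0, 0, time.UTC),
			End:   time.Date(2024, 1, 2, 14, 5, 0, 0, time.UTC),
		},
	}
	raw, err := dbn_hist.GetRange(apiKey, params)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("received %d bytes of DBN\n", len(raw))
}
```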
func GetRecordCount ¶ added in v0.0.11
func GetRecordCount(apiKey string, metaParams MetadataQueryParams) (int, error)
Calls the Metadata API to get the record count of a GetRange query. Returns the record count or an error if any.
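The three metadata helpers above share MetadataQueryParams, so one params value can be reused to size a query before streaming it. A sketch, assuming the package is imported as `dbn_hist` (the import path is an assumption):

```go
package main

import (
	"fmt"
	"log"
	"os"
	"time"

	dbn_hist "github.com/NimbleMarkets/dbn-go/hist" // assumed import path
)

func main() {
	apiKey := os.Getenv("DATABENTO_API_KEY")

	meta := dbn_hist.MetadataQueryParams{
		Dataset: "GLBX.MDP3",
		Symbols: []string{"ES.FUT"},
		Schema:  "trades",
		DateRange: dbn_hist.DateRange{
			Start: time.Date(2024, 1, 2, 0, 0, 0, 0, time.UTC),
			End:   time.Date(2024, 1, 3, 0, 0, 0, 0, time.UTC),
		},
	}

	count, err := dbn_hist.GetRecordCount(apiKey, meta)
	if err != nil {
		log.Fatal(err)
	}
	cost, err := dbn_hist.GetCost(apiKey, meta)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%d records, estimated $%.2f\n", count, cost)
}
```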
func ListDatasets ¶
func ListDatasets(apiKey string, dateRange DateRange) ([]string, error)
Lists all available dataset codes on Databento.
# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
Types ¶
type BatchFileDesc ¶ added in v0.0.13
type BatchFileDesc struct {
Filename string `json:"filename"` // The file name.
Size uint64 `json:"size"` // The size of the file in bytes.
Hash string `json:"hash"` // The SHA256 hash of the file.
Urls map[string]string `json:"urls"` // A map of download protocol to URL.
}
The file details for a batch job.
func ListFiles ¶ added in v0.0.13
func ListFiles(apiKey string, jobID string) ([]BatchFileDesc, error)
Lists all files associated with the batch job with ID `jobID`. Returns JobExpiredError if the response indicates the job has expired.
# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
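Because ListFiles surfaces JobExpiredError as an ordinary error value, errors.As can distinguish expiry from other failures. A sketch (the import path and the `"https"` URL key are assumptions):

```go
package main

import (
	"errors"
	"fmt"
	"log"
	"os"

	dbn_hist "github.com/NimbleMarkets/dbn-go/hist" // assumed import path
)

func main() {
	apiKey := os.Getenv("DATABENTO_API_KEY")
	jobID := os.Args[1] // a job ID from SubmitJob or ListJobs

	files, err := dbn_hist.ListFiles(apiKey, jobID)
	if err != nil {
		var expired dbn_hist.JobExpiredError
		if errors.As(err, &expired) {
			log.Fatalf("job %s has expired; resubmit it", expired.JobID)
		}
		log.Fatal(err)
	}
	for _, f := range files {
		fmt.Println(f.Filename, f.Size, f.Urls["https"]) // "https" key is an assumption
	}
}
```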
type BatchJob ¶ added in v0.0.13
type BatchJob struct {
Id string `json:"id"` // The unique job ID.
UserID *string `json:"user_id"` // The user ID of the user who submitted the job.
BillID *string `json:"bill_id"` // The bill ID (for internal use).
CostUSD *float64 `json:"cost_usd"` // The cost of the job in US dollars. Nil until the job is processed.
Dataset string `json:"dataset"` // The dataset code.
Symbols string `json:"symbols"` // The CSV list of symbols specified in the request.
StypeIn dbn.SType `json:"stype_in"` // The symbology type of the input `symbols`.
StypeOut dbn.SType `json:"stype_out"` // The symbology type of the output `symbols`.
Schema dbn.Schema `json:"schema"` // The data record schema.
Start time.Time `json:"start"` // The start of the request time range (inclusive).
End time.Time `json:"end"` // The end of the request time range (exclusive).
Limit uint64 `json:"limit"` // The maximum number of records to return.
Encoding dbn.Encoding `json:"encoding"` // The data encoding.
Compression dbn.Compression `json:"compression"` // The data compression mode.
PrettyPx bool `json:"pretty_px"` // If prices are formatted to the correct scale (using the fixed-precision scalar 1e-9).
PrettyTs bool `json:"pretty_ts"` // If timestamps are formatted as ISO 8601 strings.
MapSymbols bool `json:"map_symbols"` // If a symbol field is included with each text-encoded record.
SplitSymbols bool `json:"split_symbols"` // If files are split by raw symbol.
SplitDuration string `json:"split_duration"` // The maximum time interval for an individual file before splitting into multiple files.
SplitSize uint64 `json:"split_size,omitempty"` // The maximum size for an individual file before splitting into multiple files.
Delivery Delivery `json:"delivery"` // The delivery mechanism of the batch data.
RecordCount uint64 `json:"record_count"` // The number of data records (zero until the job is processed).
BilledSize uint64 `json:"billed_size"` // The size of the raw binary data used to process the batch job (used for billing purposes).
ActualSize uint64 `json:"actual_size"` // The total size of the result of the batch job after splitting and compression.
PackageSize uint64 `json:"package_size"` // The total size of the result of the batch job after any packaging (including metadata).
State JobState `json:"state"` // The current status of the batch job.
TsReceived time.Time `json:"ts_received,omitempty"` // The timestamp of when Databento received the batch job.
TsQueued time.Time `json:"ts_queued,omitempty"` // The timestamp of when the batch job was queued.
TsProcessStart time.Time `json:"ts_process_start,omitempty"` // The timestamp of when the batch job began processing.
TsProcessDone time.Time `json:"ts_process_done,omitempty"` // The timestamp of when the batch job finished processing.
TsExpiration time.Time `json:"ts_expiration,omitempty"` // The timestamp of when the batch job will expire from the Download center.
}
BatchJob is the description of a submitted batch job.
func ListJobs ¶ added in v0.0.13
Lists all jobs associated with the given state filter and 'since' date.
# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
func SubmitJob ¶ added in v0.0.13
func SubmitJob(apiKey string, jobParams SubmitJobParams) (*BatchJob, error)
SubmitJob submits a new batch job and returns a description and identifiers for the job.
# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
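A submission sketch; the returned BatchJob carries the ID needed later by ListFiles. Import paths and the schema constant name are assumptions:

```go
package main

import (
	"fmt"
	"log"
	"os"
	"time"

	dbn "github.com/NimbleMarkets/dbn-go"           // assumed import path
	dbn_hist "github.com/NimbleMarkets/dbn-go/hist" // assumed import path
)

func main() {
	apiKey := os.Getenv("DATABENTO_API_KEY")

	job, err := dbn_hist.SubmitJob(apiKey, dbn_hist.SubmitJobParams{
		Dataset: "GLBX.MDP3",
		Symbols: "ES.FUT",
		Schema:  dbn.Schema_Trades, // constant name is an assumption
		DateRange: dbn_hist.DateRange{
			Start: time.Date(2024, 1, 2, 0, 0, 0, 0, time.UTC),
			End:   time.Date(2024, 1, 5, 0, 0, 0, 0, time.UTC),
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("job %s is %s\n", job.Id, job.State)
}
```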
type ConditionDetail ¶ added in v0.0.11
type ConditionDetail struct {
Date string `json:"date"` // The day of the described data, as an ISO 8601 date string
Condition string `json:"condition"` // The condition code describing the quality and availability of the data on the given day. Possible values are the Condition_* constants.
LastModified string `json:"last_modified_date"` // The date when any schema in the dataset on the given day was last generated or modified, as an ISO 8601 date string.
}
func GetDatasetCondition ¶ added in v0.0.11
func GetDatasetCondition(apiKey string, dataset string, dateRange DateRange) ([]ConditionDetail, error)
Calls the Metadata API to get the condition of a dataset over a date range. Returns a slice of ConditionDetail, or an error if any. A zero Start time means the beginning of the dataset; a zero End time means the end of the dataset.
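For example, a zero-valued DateRange surveys the dataset's entire available history. A sketch (import path assumed):

```go
package main

import (
	"fmt"
	"log"
	"os"

	dbn_hist "github.com/NimbleMarkets/dbn-go/hist" // assumed import path
)

func main() {
	apiKey := os.Getenv("DATABENTO_API_KEY")

	// A zero DateRange covers the dataset's full available range.
	conds, err := dbn_hist.GetDatasetCondition(apiKey, "GLBX.MDP3", dbn_hist.DateRange{})
	if err != nil {
		log.Fatal(err)
	}
	// Report only days that are not fully available.
	for _, c := range conds {
		if c.Condition != dbn_hist.Condition_Available {
			fmt.Println(c.Date, c.Condition, c.LastModified)
		}
	}
}
```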
type DateRange ¶
type DateRange struct {
// The start date (inclusive).
Start time.Time `json:"start"`
// The end date (exclusive).
End time.Time `json:"end"`
}
A half-open date interval with an inclusive start date and an exclusive end date.
type Delivery ¶ added in v0.0.13
type Delivery string
func DeliveryFromString ¶ added in v0.0.13
DeliveryFromString converts a string to a Delivery. Returns an error if the string is unknown.
func (Delivery) MarshalJSON ¶ added in v0.0.13
func (Delivery) String ¶ added in v0.0.13
Returns the string representation of the Delivery, or empty string if unknown.
func (*Delivery) UnmarshalJSON ¶ added in v0.0.13
type FeedMode ¶
type FeedMode uint8
A type of data feed.
func FeedModeFromString ¶ added in v0.1.0
FeedModeFromString converts a string to a FeedMode. Returns an error if the string is unknown. Possible string values: historical, historical-streaming, live.
func (FeedMode) MarshalJSON ¶ added in v0.1.0
func (FeedMode) String ¶ added in v0.0.11
Returns the string representation of the FeedMode, or empty string if unknown.
func (*FeedMode) UnmarshalJSON ¶ added in v0.1.0
type FieldDetail ¶
type FieldDetail struct {
// The field name.
Name string `json:"name,omitempty"`
// The field type name.
TypeName string `json:"type,omitempty"`
}
The details about a field in a schema.
func ListFields ¶
Lists all fields for a schema and encoding.
# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
type GetRangeParams ¶ added in v0.0.13
type GetRangeParams struct {
Dataset string `json:"dataset"` // The dataset code.
Symbols []string `json:"symbols"` // The symbols to filter for.
Schema dbn.Schema `json:"schema"` // The data record schema.
DateTimeRange DateRange `json:"date_time_range"` // The request time range.
StypeIn dbn.SType `json:"stype_in"` // The symbology type of the input `symbols`. Defaults to RawSymbol.
StypeOut dbn.SType `json:"stype_out"` // The symbology type of the output `symbols`. Defaults to InstrumentId.
Limit uint64 `json:"limit"` // The optional maximum number of records to return. Defaults to no limit.
UpgradePolicy dbn.VersionUpgradePolicy `json:"upgrade_policy"` // How to decode DBN from prior versions. Defaults to upgrade.
}
GetRangeParams holds the parameters for a timeseries get_range request. Use NewGetRangeParams to get an instance with all the preset defaults.
type JobExpiredError ¶ added in v0.1.0
type JobExpiredError struct {
JobID string
}
JobExpiredError is returned when files are listed for an expired job.
func (JobExpiredError) Error ¶ added in v0.1.0
func (e JobExpiredError) Error() string
type JobState ¶ added in v0.0.13
type JobState string
func JobStateFromString ¶ added in v0.0.13
JobStateFromString converts a string to a JobState. Returns an error if the string is unknown.
func (JobState) MarshalJSON ¶ added in v0.0.13
func (JobState) String ¶ added in v0.0.13
Returns the string representation of the JobState, or empty string if unknown.
func (*JobState) UnmarshalJSON ¶ added in v0.0.13
type MappingInterval ¶
type MappingInterval struct {
StartDate string `json:"d0"` // The UTC start date of interval (inclusive).
EndDate string `json:"d1"` // The UTC end date of interval (exclusive).
Symbol string `json:"s"` // The resolved symbol for this interval.
}
The resolved symbol for a date range.
type MetadataQueryParams ¶ added in v0.0.11
type MetadataQueryParams struct {
Dataset string `json:"dataset,omitempty"` // The dataset code. Must be one of the values from ListDatasets.
Symbols []string `json:"symbols"` // The product symbols to filter for. Takes up to 2,000 symbols per request. If `ALL_SYMBOLS` or unspecified, the query applies to all symbols.
Schema string `json:"schema,omitempty"` // The data record schema. Must be one of the values from ListSchemas.
DateRange DateRange `json:"date_range"` // The date range of the request.
Mode FeedMode `json:"mode"` // The data feed mode of the request. Must be one of 'historical', 'historical-streaming', or 'live'.
StypeIn dbn.SType `json:"stype_in,omitempty"` // The symbology type of input symbols. Must be one of 'raw_symbol', 'instrument_id', 'parent', or 'continuous'.
Limit int32 `json:"limit,omitempty"` // The maximum number of records to return. Zero or negative values mean no limit.
}
MetadataQueryParams is the common request structure for several Metadata API queries.
func (*MetadataQueryParams) ApplyToURLValues ¶ added in v0.0.13
func (metaParams *MetadataQueryParams) ApplyToURLValues(params *url.Values) error
ApplyToURLValues fills a url.Values with the MetadataQueryParams per the Metadata API spec. Returns any error.
type PublisherDetail ¶
type PublisherDetail struct {
// The publisher ID assigned by Databento, which denotes the dataset and venue.
PublisherID uint16 `json:"publisher_id,omitempty"`
// The dataset code for the publisher.
Dataset string `json:"dataset,omitempty"`
// The venue for the publisher.
Venue string `json:"venue,omitempty"`
// The publisher description.
Description string `json:"description,omitempty"`
}
The details about a publisher.
func ListPublishers ¶
func ListPublishers(apiKey string) ([]PublisherDetail, error)
Lists the details of all publishers.
# Errors

This function returns an error when it fails to communicate with the Databento API.
type RequestError ¶
type RequestErrorResp ¶
type RequestErrorResp struct {
Detail RequestError `json:"detail"`
}
type Resolution ¶
type Resolution struct {
// A mapping from input symbol to a list of resolved symbols in the output symbology.
Mappings map[string][]MappingInterval `json:"result"`
// A list of symbols that were resolved for part, but not all, of the date range from the request.
Partial []string `json:"partial"`
// A list of symbols that were not resolved.
NotFound []string `json:"not_found"`
// The input symbology type, in string form.
StypeIn dbn.SType `json:"stype_in"`
// The output symbology type, in string form.
StypeOut dbn.SType `json:"stype_out"`
// A message from the server.
Message string `json:"message"`
// The status code of the response.
Status int `json:"status"`
}
A symbology resolution from one symbology type to another.
func SymbologyResolve ¶
func SymbologyResolve(apiKey string, params ResolveParams) (*Resolution, error)
type ResolveParams ¶
type ResolveParams struct {
// The dataset code.
Dataset string `json:"dataset"`
// The symbols to resolve.
Symbols []string `json:"symbols"`
// The symbology type of the input `symbols`. Defaults to SType_RawSymbol
StypeIn dbn.SType `json:"stype_in"`
// The symbology type of the output `symbols`. Defaults to SType_InstrumentId
StypeOut dbn.SType `json:"stype_out"`
// The date range of the resolution.
DateRange DateRange `json:"date_range"`
}
The parameters for SymbologyResolve. Adapted from databento-rs (https://github.com/databento/databento-rs/blob/main/src/historical/symbology.rs).
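A resolution sketch that maps a parent symbol to instrument IDs and walks the per-interval results. Import paths and the SType constant names are assumptions:

```go
package main

import (
	"fmt"
	"log"
	"os"
	"time"

	dbn "github.com/NimbleMarkets/dbn-go"           // assumed import path
	dbn_hist "github.com/NimbleMarkets/dbn-go/hist" // assumed import path
)

func main() {
	apiKey := os.Getenv("DATABENTO_API_KEY")

	res, err := dbn_hist.SymbologyResolve(apiKey, dbn_hist.ResolveParams{
		Dataset:  "GLBX.MDP3",
		Symbols:  []string{"ES.FUT"},
		StypeIn:  dbn.SType_Parent,       // constant name is an assumption
		StypeOut: dbn.SType_InstrumentId, // constant name is an assumption
		DateRange: dbn_hist.DateRange{
			Start: time.Date(2024, 1, 2, 0, 0, 0, 0, time.UTC),
			End:   time.Date(2024, 1, 5, 0, 0, 0, 0, time.UTC),
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	// Each input symbol resolves to one or more half-open date intervals.
	for sym, intervals := range res.Mappings {
		for _, iv := range intervals {
			fmt.Printf("%s -> %s [%s, %s)\n", sym, iv.Symbol, iv.StartDate, iv.EndDate)
		}
	}
}
```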
type SubmitJobParams ¶ added in v0.0.13
type SubmitJobParams struct {
Dataset string `json:"dataset"` // The dataset code.
Symbols string `json:"symbols"` // The symbols to filter for.
Schema dbn.Schema `json:"schema"` // The data record schema.
DateRange DateRange `json:"date_time_range"` // The request time range.
Encoding dbn.Encoding `json:"encoding"` // The data encoding. Defaults to Dbn.
Compression dbn.Compression `json:"compression"` // The data compression mode. Defaults to ZStd.
PrettyPx bool `json:"pretty_px"` // If `true`, prices will be formatted to the correct scale (using the fixed-precision scalar 1e-9). Only valid for the Csv and Json encodings.
PrettyTs bool `json:"pretty_ts"` // If `true`, timestamps will be formatted as ISO 8601 strings. Only valid for the Csv and Json encodings.
MapSymbols bool `json:"map_symbols"` // If `true`, a symbol field will be included with each text-encoded record, reducing the need to look at the `symbology.json`. Only valid for the Csv and Json encodings.
SplitSymbols bool `json:"split_symbols"` // If `true`, files will be split by raw symbol. Cannot be requested with all symbols.
SplitDuration string `json:"split_duration"` // The maximum time duration before batched data is split into multiple files. Defaults to Day.
SplitSize uint64 `json:"split_size,omitempty"` // The optional maximum size (in bytes) of each batched data file before being split. Defaults to zero (no size-based splitting).
Delivery Delivery `json:"delivery"` // The delivery mechanism for the batched data files once processed. Defaults to Download.
StypeIn dbn.SType `json:"stype_in"` // The symbology type of the input `symbols`. Defaults to RawSymbol.
StypeOut dbn.SType `json:"stype_out"` // The symbology type of the output `symbols`. Defaults to InstrumentId.
Limit uint64 `json:"limit,omitempty"` // The optional maximum number of records to return. Defaults to no limit.
}
SubmitJobParams are the parameters for SubmitJob, with all the preset defaults described above.
func (*SubmitJobParams) ApplyToURLValues ¶ added in v0.0.13
func (jobParams *SubmitJobParams) ApplyToURLValues(params *url.Values) error
ApplyToURLValues fills a url.Values with the SubmitJobParams per the Batch API spec. Returns any error.
type UnitPricesForMode ¶
type UnitPricesForMode struct {
// The data feed mode.
Mode FeedMode `json:"mode"`
// The unit prices in US dollars by data record schema.
UnitPrices map[string]float64 `json:"unit_prices,omitempty"`
}
The unit prices for a particular [`FeedMode`].
func ListUnitPrices ¶
func ListUnitPrices(apiKey string, dataset string) ([]UnitPricesForMode, error)
Lists unit prices for each data schema and feed mode in US dollars per gigabyte.
# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
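A pricing sketch that flattens the per-mode price tables into one listing (import path assumed):

```go
package main

import (
	"fmt"
	"log"
	"os"

	dbn_hist "github.com/NimbleMarkets/dbn-go/hist" // assumed import path
)

func main() {
	apiKey := os.Getenv("DATABENTO_API_KEY")

	prices, err := dbn_hist.ListUnitPrices(apiKey, "GLBX.MDP3")
	if err != nil {
		log.Fatal(err)
	}
	// One UnitPricesForMode per feed mode; each maps schema name to USD/GB.
	for _, mode := range prices {
		for schema, usd := range mode.UnitPrices {
			fmt.Printf("%s %s: $%.4f per GB\n", mode.Mode, schema, usd)
		}
	}
}
```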