dbn_hist

package
v0.7.1
Published: Jul 31, 2025 License: Apache-2.0 Imports: 12 Imported by: 0

Documentation


Constants

const (
	Condition_Available = "available"
	Condition_Degraded  = "degraded"
	Condition_Pending   = "pending"
	Condition_Missing   = "missing"
	Condition_Intraday  = "intraday"
)

Variables

This section is empty.

Functions

func GetBillableSize

func GetBillableSize(apiKey string, metaParams MetadataQueryParams) (int, error)

Calls the Metadata API to get the billable size of a GetRange query. Returns the billable size or an error if any.

func GetCost

func GetCost(apiKey string, metaParams MetadataQueryParams) (float64, error)

Calls the Metadata API to get the cost estimate of a GetRange query. Returns the cost or an error if any.

func GetRange

func GetRange(apiKey string, jobParams SubmitJobParams) ([]byte, error)

GetRange makes a streaming request for timeseries data from Databento.

This function returns the DBN stream as a byte slice.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.

func GetRecordCount

func GetRecordCount(apiKey string, metaParams MetadataQueryParams) (int, error)

Calls the Metadata API to get the record count of a GetRange query. Returns the record count or an error if any.

func ListDatasets

func ListDatasets(apiKey string, dateRange DateRange) ([]string, error)

Lists all available dataset codes on Databento.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.

func ListSchemas

func ListSchemas(apiKey string, dataset string) ([]string, error)

Lists all available schemas for the given `dataset`.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.

Types

type BatchFileDesc

type BatchFileDesc struct {
	Filename string            `json:"filename"` // The file name.
	Size     uint64            `json:"size"`     // The size of the file in bytes.
	Hash     string            `json:"hash"`     // The SHA256 hash of the file.
	Urls     map[string]string `json:"urls"`     // A map of download protocol to URL.
}

The file details for a batch job.

func ListFiles

func ListFiles(apiKey string, jobID string) ([]BatchFileDesc, error)

Lists all files associated with the batch job with ID `jobID`. Returns JobExpiredError if the response indicates the job has expired.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.

type BatchJob

type BatchJob struct {
	Id             string          `json:"id"`                         // The unique job ID.
	UserID         *string         `json:"user_id"`                    // The user ID of the user who submitted the job.
	BillID         *string         `json:"bill_id"`                    // The bill ID (for internal use).
	CostUSD        *float64        `json:"cost_usd"`                   // The cost of the job in US dollars. Will be nil until the job is processed.
	Dataset        string          `json:"dataset"`                    // The dataset code.
	Symbols        string          `json:"symbols"`                    // The CSV list of symbols specified in the request.
	StypeIn        dbn.SType       `json:"stype_in"`                   // The symbology type of the input `symbols`.
	StypeOut       dbn.SType       `json:"stype_out"`                  // The symbology type of the output `symbols`.
	Schema         dbn.Schema      `json:"schema"`                     // The data record schema.
	Start          time.Time       `json:"start"`                      // The start of the request time range (inclusive).
	End            time.Time       `json:"end"`                        // The end of the request time range (exclusive).
	Limit          uint64          `json:"limit"`                      // The maximum number of records to return.
	Encoding       dbn.Encoding    `json:"encoding"`                   // The data encoding.
	Compression    dbn.Compression `json:"compression"`                // The data compression mode.
	PrettyPx       bool            `json:"pretty_px"`                  // If prices are formatted to the correct scale (using the fixed-precision scalar 1e-9).
	PrettyTs       bool            `json:"pretty_ts"`                  // If timestamps are formatted as ISO 8601 strings.
	MapSymbols     bool            `json:"map_symbols"`                // If a symbol field is included with each text-encoded record.
	SplitSymbols   bool            `json:"split_symbols"`              // If files are split by raw symbol.
	SplitDuration  string          `json:"split_duration"`             // The maximum time interval for an individual file before splitting into multiple files.
	SplitSize      uint64          `json:"split_size,omitempty"`       // The maximum size for an individual file before splitting into multiple files.
	Delivery       Delivery        `json:"delivery"`                   // The delivery mechanism of the batch data.
	RecordCount    uint64          `json:"record_count"`               // The number of data records (`None` until the job is processed).
	BilledSize     uint64          `json:"billed_size"`                // The size of the raw binary data used to process the batch job (used for billing purposes).
	ActualSize     uint64          `json:"actual_size"`                // The total size of the result of the batch job after splitting and compression.
	PackageSize    uint64          `json:"package_size"`               // The total size of the result of the batch job after any packaging (including metadata).
	State          JobState        `json:"state"`                      // The current status of the batch job.
	TsReceived     time.Time       `json:"ts_received,omitempty"`      // The timestamp of when Databento received the batch job.
	TsQueued       time.Time       `json:"ts_queued,omitempty"`        // The timestamp of when the batch job was queued.
	TsProcessStart time.Time       `json:"ts_process_start,omitempty"` // The timestamp of when the batch job began processing.
	TsProcessDone  time.Time       `json:"ts_process_done,omitempty"`  // The timestamp of when the batch job finished processing.
	TsExpiration   time.Time       `json:"ts_expiration,omitempty"`    // The timestamp of when the batch job will expire from the Download center.
}

BatchJob is the description of a submitted batch job.

func ListJobs

func ListJobs(apiKey string, stateFilter string, sinceYMD time.Time) ([]BatchJob, error)

Lists all jobs associated with the given state filter and 'since' date.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.

func SubmitJob

func SubmitJob(apiKey string, jobParams SubmitJobParams) (*BatchJob, error)

SubmitJob submits a new batch job and returns a description and identifiers for the job.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.

type ConditionDetail

type ConditionDetail struct {
	Date         string // The day of the described data, as an ISO 8601 date string
	Condition    string // The condition code describing the quality and availability of the data on the given day. Possible values are the Condition_* constants.
	LastModified string // The date when any schema in the dataset on the given day was last generated or modified, as an ISO 8601 date string.
}

func GetDatasetCondition

func GetDatasetCondition(apiKey string, dataset string, dateRange DateRange) ([]ConditionDetail, error)

Calls the Metadata API to get the condition of a dataset over a date range. Returns a slice of ConditionDetail, or an error if any. A zero time for Start means the beginning of the dataset; a zero time for End means the end of the dataset.

type DateRange

type DateRange struct {
	// The start date (inclusive).
	Start time.Time `json:"start"`
	// The end date (exclusive).
	End time.Time `json:"end"`
}

A half-closed date interval with an inclusive start date and an exclusive end date.

func GetDatasetRange

func GetDatasetRange(apiKey string, dataset string) (DateRange, error)

Calls the Metadata API to get the date range of a dataset. Returns the DateRange, or an error if any.

type Delivery

type Delivery string
const (
	Delivery_Unknown  Delivery = ""
	Delivery_Download Delivery = "download"
	Delivery_S3       Delivery = "s3"
	Delivery_Disk     Delivery = "disk"
)

func DeliveryFromString

func DeliveryFromString(str string) (Delivery, error)

DeliveryFromString converts a string to a Delivery. Returns an error if the string is unknown.

func (Delivery) MarshalJSON

func (d Delivery) MarshalJSON() ([]byte, error)

func (Delivery) String

func (d Delivery) String() string

Returns the string representation of the Delivery, or empty string if unknown.

func (*Delivery) UnmarshalJSON

func (d *Delivery) UnmarshalJSON(data []byte) error

type FeedMode

type FeedMode uint8

A type of data feed.

const (
	// The historical batch data feed.
	FeedMode_Historical FeedMode = 0
	// The historical streaming data feed.
	FeedMode_HistoricalStreaming FeedMode = 1
	// The Live data feed for real-time and intraday historical.
	FeedMode_Live FeedMode = 2
)

func FeedModeFromString

func FeedModeFromString(str string) (FeedMode, error)

FeedModeFromString converts a string to a FeedMode. Returns an error if the string is unknown. Possible string values: historical, historical-streaming, live.

func (FeedMode) MarshalJSON

func (f FeedMode) MarshalJSON() ([]byte, error)

func (FeedMode) String

func (f FeedMode) String() string

Returns the string representation of the FeedMode, or empty string if unknown.

func (*FeedMode) UnmarshalJSON

func (f *FeedMode) UnmarshalJSON(data []byte) error

type FieldDetail

type FieldDetail struct {
	// The field name.
	Name string `json:"name,omitempty"`
	// The field type name.
	TypeName string `json:"type_name,omitempty"`
}

The details about a field in a schema.

func ListFields

func ListFields(apiKey string, encoding dbn.Encoding, schema dbn.Schema) ([]FieldDetail, error)

Lists all fields for a schema and encoding.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.

type GetRangeParams

type GetRangeParams struct {
	Dataset       string                   `json:"dataset"`         // The dataset code.
	Symbols       []string                 `json:"symbols"`         // The symbols to filter for.
	Schema        dbn.Schema               `json:"schema"`          // The data record schema.
	DateTimeRange DateRange                `json:"date_time_range"` // The request time range.
	StypeIn       dbn.SType                `json:"stype_in"`        // The symbology type of the input `symbols`. Defaults to [`RawSymbol`](dbn::enums::SType::RawSymbol).
	StypeOut      dbn.SType                `json:"stype_out"`       // The symbology type of the output `symbols`. Defaults to [`InstrumentId`](dbn::enums::SType::InstrumentId).
	Limit         uint64                   `json:"limit"`           // The optional maximum number of records to return. Defaults to no limit.
	UpgradePolicy dbn.VersionUpgradePolicy `json:"upgrade_policy"`  // How to decode DBN from prior versions. Defaults to upgrade.
}

GetRangeParams holds the parameters for a timeseries get_range query (mirroring [`TimeseriesClient::get_range()`] in the Rust client). Use NewGetRangeParams to get an instance with all the preset defaults.

type JobExpiredError

type JobExpiredError struct {
	JobID string
}

JobExpiredError is returned when an expired job's files are listed.

func (JobExpiredError) Error

func (e JobExpiredError) Error() string

type JobState

type JobState string
const (
	JobState_Unknown    JobState = ""
	JobState_Received   JobState = "received"
	JobState_Queued     JobState = "queued"
	JobState_Processing JobState = "processing"
	JobState_Done       JobState = "done"
	JobState_Expired    JobState = "expired"
)

func JobStateFromString

func JobStateFromString(str string) (JobState, error)

JobStateFromString converts a string to a JobState. Returns an error if the string is unknown.

func (JobState) MarshalJSON

func (j JobState) MarshalJSON() ([]byte, error)

func (JobState) String

func (j JobState) String() string

Returns the string representation of the JobState, or empty string if unknown.

func (*JobState) UnmarshalJSON

func (j *JobState) UnmarshalJSON(data []byte) error

type MappingInterval

type MappingInterval struct {
	StartDate string `json:"d0"` // The UTC start date of interval (inclusive).
	EndDate   string `json:"d1"` // The UTC end date of interval (exclusive).
	Symbol    string `json:"s"`  // The resolved symbol for this interval.
}

The resolved symbol for a date range.

type MetadataQueryParams

type MetadataQueryParams struct {
	Dataset   string    `json:"dataset,omitempty"`  // The dataset code. Must be one of the values from ListDatasets.
	Symbols   []string  `json:"symbols"`            // The product symbols to filter for. Takes up to 2,000 symbols per request. If `ALL_SYMBOLS` or not specified then will be for all symbols.
	Schema    string    `json:"schema,omitempty"`   // The data record schema. Must be one of the values from ListSchemas.
	DateRange DateRange `json:"date_range"`         // The date range of the request.
	Mode      FeedMode  `json:"mode,omitempty"`     // The data feed mode of the request. Must be one of 'historical', 'historical-streaming', or 'live'.
	StypeIn   dbn.SType `json:"stype_in,omitempty"` // The symbology type of input symbols. Must be one of 'raw_symbol', 'instrument_id', 'parent', or 'continuous'.
	Limit     int32     `json:"limit,omitempty"`    // The maximum number of records to return. 0 and negative (-1) means no limit.
}

MetadataQueryParams is the common request structure for several Metadata API queries

func (*MetadataQueryParams) ApplyToURLValues

func (metaParams *MetadataQueryParams) ApplyToURLValues(params *url.Values) error

ApplyToURLValues fills a url.Values with the MetadataQueryParams per the Metadata API spec. Returns any error.

type PublisherDetail

type PublisherDetail struct {
	// The publisher ID assigned by Databento, which denotes the dataset and venue.
	PublisherID uint16 `json:"publisher_id,omitempty"`
	// The dataset code for the publisher.
	Dataset string `json:"dataset,omitempty"`
	// The venue for the publisher.
	Venue string `json:"venue,omitempty"`
	// The publisher description.
	Description string `json:"description,omitempty"`
}

The details about a publisher.

func ListPublishers

func ListPublishers(apiKey string) ([]PublisherDetail, error)

Lists the details of all publishers.

# Errors

This function returns an error when it fails to communicate with the Databento API.

type RequestError

type RequestError struct {
	Case       string `json:"case"`
	Message    string `json:"message"`
	StatusCode int    `json:"status_code"`
	Docs       string `json:"docs,omitempty"`
	Payload    string `json:"payload,omitempty"`
}

type RequestErrorResp

type RequestErrorResp struct {
	Detail RequestError `json:"detail"`
}

type Resolution

type Resolution struct {
	// A mapping from input symbol to a list of resolved symbols in the output symbology.
	Mappings map[string][]MappingInterval `json:"result"`
	// A list of symbols that were resolved for part, but not all of the date range from the request.
	Partial []string `json:"partial"`
	// A list of symbols that were not resolved.
	NotFound []string `json:"not_found"`
	// The input symbology type, in string form.
	StypeIn dbn.SType `json:"stype_in"`
	// The output symbology type, in string form.
	StypeOut dbn.SType `json:"stype_out"`
	// A message from the server.
	Message string `json:"message"`
	// The status code of the response.
	Status int `json:"status"`
}

A symbology resolution from one symbology type to another.

func SymbologyResolve

func SymbologyResolve(apiKey string, params ResolveParams) (*Resolution, error)

type ResolveParams

type ResolveParams struct {
	// The dataset code.
	Dataset string `json:"dataset"`
	// The symbols to resolve.
	Symbols []string `json:"symbols"`
	// The symbology type of the input `symbols`. Defaults to SType_RawSymbol
	StypeIn dbn.SType `json:"stype_in"`
	// The symbology type of the output `symbols`. Defaults to SType_InstrumentId
	StypeOut dbn.SType `json:"stype_out"`
	// The date range of the resolution.
	DateRange DateRange `json:"date_range"`
}

ResolveParams are the parameters for SymbologyResolve. Adapted from https://github.com/databento/databento-rs/blob/main/src/historical/symbology.rs

type SubmitJobParams

type SubmitJobParams struct {
	Dataset       string          `json:"dataset"`              // The dataset code.
	Symbols       string          `json:"symbols"`              // The symbols to filter for.
	Schema        dbn.Schema      `json:"schema"`               // The data record schema.
	DateRange     DateRange       `json:"date_time_range"`      // The request time range.
	Encoding      dbn.Encoding    `json:"encoding"`             // The data encoding. Defaults to [`Dbn`](Encoding::Dbn).
	Compression   dbn.Compression `json:"compression"`          // The data compression mode. Defaults to [`ZStd`](Compression::ZStd).
	PrettyPx      bool            `json:"pretty_px"`            // If `true`, prices will be formatted to the correct scale (using the fixed-precision scalar 1e-9). Only valid for [`Encoding::Csv`] and [`Encoding::Json`].
	PrettyTs      bool            `json:"pretty_ts"`            // If `true`, timestamps will be formatted as ISO 8601 strings. Only valid for [`Encoding::Csv`] and [`Encoding::Json`].
	MapSymbols    bool            `json:"map_symbols"`          // If `true`, a symbol field will be included with each text-encoded record, reducing the need to look at the `symbology.json`. Only valid for [`Encoding::Csv`] and [`Encoding::Json`].
	SplitSymbols  bool            `json:"split_symbols"`        // If `true`, files will be split by raw symbol. Cannot be requested with [`Symbols::All`].
	SplitDuration string          `json:"split_duration"`       // The maximum time duration before batched data is split into multiple files. Defaults to [`Day`](SplitDuration::Day).
	SplitSize     uint64          `json:"split_size,omitempty"` // The optional maximum size (in bytes) of each batched data file before being split. Defaults to `None`.
	Delivery      Delivery        `json:"delivery"`             // The delivery mechanism for the batched data files once processed. Defaults to [`Download`](Delivery::Download).
	StypeIn       dbn.SType       `json:"stype_in"`             // The symbology type of the input `symbols`. Defaults to [`RawSymbol`](dbn::enums::SType::RawSymbol).
	StypeOut      dbn.SType       `json:"stype_out"`            // The symbology type of the output `symbols`. Defaults to [`InstrumentId`](dbn::enums::SType::InstrumentId).
	Limit         uint64          `json:"limit,omitempty"`      // The optional maximum number of records to return. Defaults to no limit.
}

SubmitJobParams are the parameters for [`BatchClient::submit_job()`]. Use [`SubmitJobParams::builder()`] to get a builder type with all the preset defaults.

func (*SubmitJobParams) ApplyToURLValues

func (jobParams *SubmitJobParams) ApplyToURLValues(params *url.Values) error

ApplyToURLValues fills a url.Values with the SubmitJobParams per the Batch API spec. Returns any error.

type UnitPricesForMode

type UnitPricesForMode struct {
	// The data feed mode.
	Mode FeedMode `json:"mode,omitempty"`
	// The unit prices in US dollars by data record schema.
	UnitPrices map[string]float64 `json:"unit_prices,omitempty"`
}

The unit prices for a particular [`FeedMode`].

func ListUnitPrices

func ListUnitPrices(apiKey string, dataset string) ([]UnitPricesForMode, error)

Lists unit prices for each data schema and feed mode in US dollars per gigabyte.

# Errors

This function returns an error when it fails to communicate with the Databento API or the API indicates there's an issue with the request.
