Documentation ¶
Overview ¶
Package bigquery provides an implementation of a driver for the BigQuery data warehouse.
Index ¶
Constants ¶
const MapperName = "bigquery"
MapperName is the name of the BigQuery type mapper.
Variables ¶
This section is empty.
Functions ¶
func NewBigQueryTableDriver ¶
func NewBigQueryTableDriver(
	db *bigquery.Client,
	dataset string,
	writer Writer,
	tableCreationTimeout time.Duration,
) warehouse.Driver
NewBigQueryTableDriver creates a new BigQuery table driver.
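A minimal wiring sketch (the import path for this package, the dataset name, the project ID, and the timeouts are placeholders; the mapper and Writer constructors used here are documented further down this page):

package main

import (
	"context"
	"log"
	"time"

	"cloud.google.com/go/bigquery"

	// Hypothetical import path for this package; use the module's real path.
	bq "example.com/yourmodule/warehouse/bigquery"
)

func main() {
	ctx := context.Background()

	// Standard BigQuery client from cloud.google.com/go/bigquery.
	client, err := bigquery.NewClient(ctx, "my-gcp-project")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// See NewFieldTypeMapper, NewLoadJobWriter and NewStreamingWriter below.
	mapper := bq.NewFieldTypeMapper()
	writer := bq.NewLoadJobWriter(client, "analytics", 30*time.Second, mapper)

	// Tables are created in dataset "analytics" with a 2-minute creation timeout.
	driver := bq.NewBigQueryTableDriver(client, "analytics", writer, 2*time.Minute)
	_ = driver // pass the driver to whatever consumes warehouse.Driver
}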
func NewFieldTypeMapper ¶
func NewFieldTypeMapper() warehouse.FieldTypeMapper[SpecificBigQueryType]
NewFieldTypeMapper creates a mapper that supports BigQuery types.
Types ¶
type SpecificBigQueryType ¶
type SpecificBigQueryType struct {
	FieldType  bigquery.FieldType
	Required   bool
	Repeated   bool
	FormatFunc func(SpecificBigQueryType) func(i any, m arrow.Metadata) (any, error)
	// Schema holds the nested schema for RECORD types, nil for primitive types
	Schema *bigquery.Schema
}
SpecificBigQueryType represents a BigQuery data type with its string representation and formatting function.
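For illustration, a hand-built value describing a repeated RECORD column might look like the sketch below. In practice the mapper returned by NewFieldTypeMapper produces these values; the pass-through FormatFunc, the import alias bq, and the Arrow import are assumptions reused from the first example.

// Nested schema for the RECORD type, using cloud.google.com/go/bigquery.
nested := bigquery.Schema{
	{Name: "id", Type: bigquery.IntegerFieldType, Required: true},
	{Name: "label", Type: bigquery.StringFieldType},
}

recordType := bq.SpecificBigQueryType{
	FieldType: bigquery.RecordFieldType,
	Required:  false,
	Repeated:  true,
	// A pass-through formatter: hands the Arrow value to the client unchanged.
	FormatFunc: func(bq.SpecificBigQueryType) func(i any, m arrow.Metadata) (any, error) {
		return func(i any, m arrow.Metadata) (any, error) { return i, nil }
	},
	Schema: &nested, // only set for RECORD types; nil for primitives
}
_ = recordType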
type Writer ¶
type Writer interface {
	Write(ctx context.Context, tableName string, schema *arrow.Schema, rows []map[string]any) error
}
Writer is the interface implemented by types that write batches of rows to a BigQuery table.
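Since Writer is a single-method interface, it is straightforward to substitute in tests. A hypothetical in-memory stub, assuming the context and Arrow imports from the earlier sketches:

// memWriter is a hypothetical test double that records written rows per table.
type memWriter struct {
	rows map[string][]map[string]any
}

var _ bq.Writer = (*memWriter)(nil)

func (w *memWriter) Write(ctx context.Context, tableName string, schema *arrow.Schema, rows []map[string]any) error {
	if w.rows == nil {
		w.rows = make(map[string][]map[string]any)
	}
	w.rows[tableName] = append(w.rows[tableName], rows...)
	return nil
}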
func NewLoadJobWriter ¶
func NewLoadJobWriter(
	db *bigquery.Client,
	dataset string,
	queryTimeout time.Duration,
	fieldTypeMapper warehouse.FieldTypeMapper[SpecificBigQueryType],
) Writer
NewLoadJobWriter creates a new free-tier-compatible writer that uses load jobs with NDJSON payloads.
func NewStreamingWriter ¶
func NewStreamingWriter(
	db *bigquery.Client,
	dataset string,
	queryTimeout time.Duration,
	fieldTypeMapper warehouse.FieldTypeMapper[SpecificBigQueryType],
) Writer
NewStreamingWriter creates a new writer that uses streaming inserts, which are not available on the free tier.
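Both constructors take the same arguments and return the same Writer interface, so callers can switch between them without other changes. A sketch reusing the client, dataset name, and hypothetical import alias from the first example:

mapper := bq.NewFieldTypeMapper()

// Free-tier compatible: rows are batched into NDJSON load jobs.
loadWriter := bq.NewLoadJobWriter(client, "analytics", 30*time.Second, mapper)

// Requires a billed project: rows go through BigQuery streaming inserts.
streamWriter := bq.NewStreamingWriter(client, "analytics", 30*time.Second, mapper)

_, _ = loadWriter, streamWriter // either one satisfies Writer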