Documentation
Index
- Constants
- func GetSink() api.Sink
- func GetSource() api.Source
- type FileType
- type Source
- func (fs *Source) Close(ctx api.StreamContext) error
- func (fs *Source) Connect(ctx api.StreamContext, sch api.StatusChangeHandler) error
- func (fs *Source) Info() (i model.NodeInfo)
- func (fs *Source) Load(ctx api.StreamContext, ingest api.TupleIngest, ingestError api.ErrorIngest)
- func (fs *Source) Provision(ctx api.StreamContext, props map[string]any) error
- func (fs *Source) Pull(ctx api.StreamContext, _ time.Time, ingest api.TupleIngest, ...)
- func (fs *Source) SetEofIngest(eof api.EOFIngest)
- type SourceConfig
Constants
const (
	GZIP = "gzip"
	ZSTD = "zstd"
)
Variables
This section is empty.
Functions
func GetSink
func GetSink() api.Sink
func GetSource
func GetSource() api.Source
Types
type Source
type Source struct {
	// contains filtered or unexported fields
}
Source loads data from the file system. Depending on the file type, it may read line by line (e.g. lines, csv); otherwise, it reads the file as a whole and sends it to a companion reader node to read and split. The planner needs to plan according to the file type.
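The following minimal sketch shows how a caller might drive this lifecycle in the documented order (Provision, Connect, Load, Close). It is an illustration only: the package name and the api import path are assumed, and the context, status handler and ingest callbacks are taken as parameters because they are supplied by the hosting runtime.

package file // package name assumed

import (
	"fmt"

	"github.com/lf-edge/ekuiper/contract/v2/api" // import path assumed
)

// runFileSource is a hypothetical driver for the documented lifecycle.
// ctx, sch, ingest and ingestError are assumed to be provided by the runtime.
func runFileSource(ctx api.StreamContext, sch api.StatusChangeHandler,
	ingest api.TupleIngest, ingestError api.ErrorIngest) error {
	src, ok := GetSource().(*Source)
	if !ok {
		return fmt.Errorf("unexpected source type")
	}
	// "lines" and "csv" file types are read line by line; other types are
	// read as a whole and split by a companion reader node (see above).
	props := map[string]any{
		"path":     "data",
		"fileType": "lines",
	}
	if err := src.Provision(ctx, props); err != nil {
		return err
	}
	if err := src.Connect(ctx, sch); err != nil {
		return err
	}
	src.Load(ctx, ingest, ingestError)
	return src.Close(ctx)
}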
func (*Source) Connect
func (fs *Source) Connect(ctx api.StreamContext, sch api.StatusChangeHandler) error
func (*Source) Load
func (fs *Source) Load(ctx api.StreamContext, ingest api.TupleIngest, ingestError api.ErrorIngest)
func (*Source) Pull
func (fs *Source) Pull(ctx api.StreamContext, _ time.Time, ingest api.TupleIngest, ingestError api.ErrorIngest)
Pull file source may ingest bytes or tuples. For a stream source, it ingests one line at a time. For a batch source, it ingests the whole file, so it needs a reader node to coordinate and read the content into lines/arrays.
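For the pull path, a similar hypothetical wrapper (same assumed imports as the sketch above, plus "time") would register the EOF callback and trigger a pull; note that the time argument is ignored by this source:

// pullOnce is a hypothetical wrapper; eof, ingest and ingestError are
// assumed to be supplied by the runtime.
func pullOnce(ctx api.StreamContext, fs *Source,
	eof api.EOFIngest, ingest api.TupleIngest, ingestError api.ErrorIngest) {
	// Let the runtime know when all files have been consumed.
	fs.SetEofIngest(eof)
	// The time argument is ignored by this source, per the signature above.
	fs.Pull(ctx, time.Now(), ingest, ingestError)
}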
func (*Source) SetEofIngest
func (fs *Source) SetEofIngest(eof api.EOFIngest)
type SourceConfig
type SourceConfig struct {
	FileName         string            `json:"datasource"`
	FileType         string            `json:"fileType"`
	Path             string            `json:"path"`
	Interval         cast.DurationConf `json:"interval"`
	IsTable          bool              `json:"isTable"`
	Parallel         bool              `json:"parallel"`
	SendInterval     cast.DurationConf `json:"sendInterval"`
	ActionAfterRead  int               `json:"actionAfterRead"`
	MoveTo           string            `json:"moveTo"`
	IgnoreStartLines int               `json:"ignoreStartLines"`
	IgnoreEndLines   int               `json:"ignoreEndLines"`
	// Only used for planning
	Decompression string `json:"decompression"`
}
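As a hedged illustration of how the json tags map stream properties onto this struct, the snippet below builds a property map such as might be passed to Provision. The concrete values, and the string form for the duration fields, are assumptions, not defaults.

// exampleProps is a hypothetical property map; keys follow the json tags above.
var exampleProps = map[string]any{
	"path":             "data",         // SourceConfig.Path
	"datasource":       "input.csv.gz", // SourceConfig.FileName
	"fileType":         "csv",
	"interval":         "10s", // cast.DurationConf; string form assumed
	"decompression":    GZIP,  // or ZSTD, from the constants above
	"ignoreStartLines": 1,
}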