Documentation

Index
- type Crawler
- func (c *Crawler) CrawlPublishers(publishers []common.Publisher) error
- func (c *Crawler) CrawlRepo(repoURL url.URL, publisher common.Publisher) error
- func (c *Crawler) ProcessRepo(repository common.Repository)
- func (c *Crawler) ProcessRepositories(repos chan common.Repository)
- func (c *Crawler) ScanPublisher(publisher common.Publisher)
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Crawler
Crawler is a helper type representing a crawler.
func NewCrawler
NewCrawler initializes a new Crawler object and connects to Elasticsearch (if dryRun == false).
func (*Crawler) CrawlPublishers
func (c *Crawler) CrawlPublishers(publishers []common.Publisher) error
CrawlPublishers processes a list of publishers.
func (*Crawler) ProcessRepo
func (c *Crawler) ProcessRepo(repository common.Repository)
ProcessRepo looks for a publiccode.yml file in a repository and, if found, processes it.
func (*Crawler) ProcessRepositories
func (c *Crawler) ProcessRepositories(repos chan common.Repository)
ProcessRepositories consumes the repositories channel, checks each repository's publiccode.yml, and sends the new data to the API when the file is valid.
func (*Crawler) ScanPublisher
func (c *Crawler) ScanPublisher(publisher common.Publisher)
ScanPublisher scans all of the publisher's repositories and sends the ones with a valid publiccode.yml to the repositories channel.