robots

package
v1.0.1 Latest
Warning

This package is not in the latest version of its module.

Published: Mar 23, 2026 License: MIT Imports: 5 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Checker

type Checker struct {
	// contains filtered or unexported fields
}

Checker evaluates robots.txt rules for AI user-agents.

func New

func New(f *fetcher.Fetcher) *Checker

New creates a Checker.

func (*Checker) IsAllowed

func (c *Checker) IsAllowed(ctx context.Context, rawURL string) bool

IsAllowed checks whether the given URL is allowed by robots.txt. Returns true if allowed (or if robots.txt is absent/unparseable).
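The package does not document its matching semantics, but a common interpretation of robots.txt (per RFC 9309) is longest-match-wins between Allow and Disallow rules within the matching user-agent group, defaulting to allowed when nothing matches. The sketch below illustrates that kind of check; the function name `allowed` and the simplified group handling are assumptions for illustration, not this package's actual implementation.

```go
package main

import (
	"fmt"
	"strings"
)

// allowed reports whether path is permitted for the given user-agent
// under a simple longest-match reading of robots.txt rules.
// Illustrative sketch only; not the robots package's parser.
func allowed(robotsTxt, userAgent, path string) bool {
	inGroup := false
	bestLen := -1
	bestAllow := true // no matching rule means allowed
	for _, line := range strings.Split(robotsTxt, "\n") {
		// Strip comments and surrounding whitespace.
		if i := strings.Index(line, "#"); i >= 0 {
			line = line[:i]
		}
		line = strings.TrimSpace(line)
		if line == "" {
			continue
		}
		parts := strings.SplitN(line, ":", 2)
		if len(parts) != 2 {
			continue
		}
		field := strings.ToLower(strings.TrimSpace(parts[0]))
		value := strings.TrimSpace(parts[1])
		switch field {
		case "user-agent":
			inGroup = value == "*" || strings.EqualFold(value, userAgent)
		case "allow", "disallow":
			if !inGroup || value == "" {
				continue
			}
			// Longest matching prefix decides the outcome.
			if strings.HasPrefix(path, value) && len(value) > bestLen {
				bestLen = len(value)
				bestAllow = field == "allow"
			}
		}
	}
	return bestAllow
}

func main() {
	robots := "User-agent: GPTBot\nDisallow: /private/\nAllow: /private/public/\n"
	fmt.Println(allowed(robots, "GPTBot", "/private/data"))     // false
	fmt.Println(allowed(robots, "GPTBot", "/private/public/x")) // true: longer Allow wins
	fmt.Println(allowed(robots, "OtherBot", "/private/data"))   // true: no matching group
}
```

Note the default-allow stance mirrors IsAllowed's documented behavior when robots.txt is absent or unparseable: crawl denial must be stated explicitly, so any failure to find or interpret rules falls back to permitting the URL.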
