Documentation
Overview ¶
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Index ¶
Constants ¶
This section is empty.
Variables ¶
var (
	ErrNoClosing = errors.New("No closing quotation")
	ErrNoEscaped = errors.New("No escaped character")
)
Functions ¶
Types ¶
type DefaultTokenizer ¶
type DefaultTokenizer struct{}
DefaultTokenizer implements a simple Unix-shell-like tokenizer.
func (*DefaultTokenizer) IsEscape ¶
func (t *DefaultTokenizer) IsEscape(r rune) bool
func (*DefaultTokenizer) IsEscapedQuote ¶
func (t *DefaultTokenizer) IsEscapedQuote(r rune) bool
func (*DefaultTokenizer) IsQuote ¶
func (t *DefaultTokenizer) IsQuote(r rune) bool
func (*DefaultTokenizer) IsWhitespace ¶
func (t *DefaultTokenizer) IsWhitespace(r rune) bool
func (*DefaultTokenizer) IsWord ¶
func (t *DefaultTokenizer) IsWord(r rune) bool
type Lexer ¶
type Lexer struct {
// contains filtered or unexported fields
}
Lexer represents a lexical analyzer.
func NewGitLexer ¶
NewGitLexer creates a new Lexer reading from an io.Reader. The Lexer uses a DefaultTokenizer that follows the posix and whitespacesplit rules.
func NewGitLexerString ¶
NewGitLexerString creates a new Lexer reading from a string. The Lexer uses a DefaultTokenizer that follows the posix and whitespacesplit rules.
func NewLexer ¶
NewLexer creates a new Lexer reading from an io.Reader. The Lexer uses a DefaultTokenizer that follows the posix and whitespacesplit rules.
func NewLexerString ¶
NewLexerString creates a new Lexer reading from a string. The Lexer uses a DefaultTokenizer that follows the posix and whitespacesplit rules.
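Since this excerpt does not show the Lexer's token-reading method, the following self-contained sketch illustrates the kind of whitespace splitting with quote handling that a Lexer built by NewLexerString performs. It is an illustrative reimplementation under assumed rules, not the package's code:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// splitShell is a simplified sketch of shell-style splitting: whitespace
// separates tokens, and single or double quotes group characters
// (including spaces) into one token. Escape handling is omitted.
func splitShell(s string) ([]string, error) {
	var (
		tokens []string
		buf    strings.Builder
		quote  rune // 0 when outside a quoted region
		inTok  bool
	)
	for _, r := range s {
		switch {
		case quote != 0:
			if r == quote {
				quote = 0 // closing quote ends the quoted region
			} else {
				buf.WriteRune(r)
			}
		case r == '\'' || r == '"':
			quote, inTok = r, true // opening quote starts a quoted region
		case r == ' ' || r == '\t':
			if inTok {
				tokens = append(tokens, buf.String())
				buf.Reset()
				inTok = false
			}
		default:
			buf.WriteRune(r)
			inTok = true
		}
	}
	if quote != 0 {
		// Mirrors the package's ErrNoClosing condition.
		return nil, errors.New("No closing quotation")
	}
	if inTok {
		tokens = append(tokens, buf.String())
	}
	return tokens, nil
}

func main() {
	toks, err := splitShell(`git commit -m "initial commit"`)
	fmt.Println(toks, err) // [git commit -m initial commit] <nil>
}
```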
func (*Lexer) SetTokenizer ¶
SetTokenizer sets a Tokenizer.