Documentation ¶
Index ¶
- Variables
- func IsParsed(v any, trm ReplacedToken) bool
- type Generator
- func (c *Generator) Config() *config.GenVarsConfig
- func (c *Generator) DiscoverTokens(text string) (NormalizedTokenSafe, error)
- func (c *Generator) Generate(tokens []string) (ReplacedToken, error)
- func (c *Generator) NormalizeRawToken(rtm *RawTokenConfig) NormalizedTokenSafe
- func (c *Generator) WithConfig(cfg *config.GenVarsConfig) *Generator
- func (c *Generator) WithStores(sm storeIface) *Generator
- type NormalizedToken
- type NormalizedTokenSafe
- type Opts
- type RawTokenConfig
- type ReplacedToken
- type TokenResponse
Constants ¶
This section is empty.
Variables ¶
var ErrProvidersNotFound = errors.New("providers not initialised")
var ErrTokenDiscovery = errors.New("failed to discover tokens")
var ErrTokenNotFound = errors.New("token not found")
Functions ¶
func IsParsed ¶
func IsParsed(v any, trm ReplacedToken) bool
IsParsed will try to parse the found return string into a map[string]string. If parsing succeeds, it will convert the result to a map with all keys uppercased.
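The described behaviour can be sketched as follows. This is a minimal, self-contained illustration, assuming the retrieved store value is a JSON object; the helper name `tryParse` and the JSON shape are assumptions, not the package's actual internals.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// tryParse attempts to unmarshal a retrieved store value into a
// map[string]string; on success it returns a copy with all keys
// uppercased, mirroring the behaviour described for IsParsed.
func tryParse(v string) (map[string]string, bool) {
	parsed := map[string]string{}
	if err := json.Unmarshal([]byte(v), &parsed); err != nil {
		return nil, false
	}
	upper := make(map[string]string, len(parsed))
	for k, val := range parsed {
		upper[strings.ToUpper(k)] = val
	}
	return upper, true
}

func main() {
	m, ok := tryParse(`{"host":"db.local","port":"5432"}`)
	fmt.Println(ok, m["HOST"], m["PORT"]) // true db.local 5432
}
```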
Types ¶
type Generator ¶
Generator is the main struct holding the strategy-pattern interface, any initialised config (if overridden with the With* builders), as well as the final outString and the initial rawMap, which will be passed in a loop into a goroutine to perform the relevant strategy network calls against the config store implementations
func New ¶
New returns a new instance of Generator with a default strategy pattern, which will be overwritten during the first run over a found tokens map
func (*Generator) Config ¶
func (c *Generator) Config() *config.GenVarsConfig
Config returns the GenVarsConfig set on the Generator
func (*Generator) DiscoverTokens ¶
func (c *Generator) DiscoverTokens(text string) (NormalizedTokenSafe, error)
DiscoverTokens generates a k/v map of the tokens with their corresponding secret/paramstore values. The standard pattern of a token should follow a path-like string.
Called only from a slice of tokens
func (*Generator) Generate ¶
func (c *Generator) Generate(tokens []string) (ReplacedToken, error)
Generate generates a k/v map of the tokens with their corresponding secret/paramstore values. The standard pattern of a token should follow a path-like string.
Called only from a slice of tokens
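The concurrent fan-out that the Generator description alludes to (tokens passed in a loop into goroutines that perform the store calls) can be sketched in a self-contained form. `fakeStore`, the token strings, and the function name `generate` are illustrative stand-ins, not the package's real API.

```go
package main

import (
	"fmt"
	"sync"
)

// fakeStore stands in for a secret/paramstore backend; the real
// Generator dispatches to its configured store implementations.
func fakeStore(token string) string {
	return "value-for-" + token
}

// generate sketches the described behaviour of Generate: each token
// is resolved in its own goroutine and the results are collected
// into a k/v map keyed by the original token.
func generate(tokens []string) map[string]string {
	var (
		mu  sync.Mutex
		wg  sync.WaitGroup
		out = make(map[string]string, len(tokens))
	)
	for _, t := range tokens {
		wg.Add(1)
		go func(tok string) {
			defer wg.Done()
			v := fakeStore(tok) // a network call in the real implementation
			mu.Lock()
			out[tok] = v
			mu.Unlock()
		}(t)
	}
	wg.Wait()
	return out
}

func main() {
	fmt.Println(generate([]string{"/app/db", "/app/port"}))
}
```

The mutex-guarded map plus WaitGroup is one idiomatic way to collect concurrent results; a channel of responses (compare TokenResponse below) would work equally well.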
func (*Generator) NormalizeRawToken ¶
func (c *Generator) NormalizeRawToken(rtm *RawTokenConfig) NormalizedTokenSafe
func (*Generator) WithConfig ¶
func (c *Generator) WithConfig(cfg *config.GenVarsConfig) *Generator
WithConfig sets a custom config on the Generator
func (*Generator) WithStores ¶
WithStores assigns additional stores to the strategy
Adds additional funcs for storageRetrieval; used for testing only
type NormalizedToken ¶
type NormalizedToken struct {
// contains filtered or unexported fields
}
NormalizedToken represents the struct after all the possible tokens were merged into the lowest common denominator. The idea is to minimize the number of network calls to the underlying `store` implementations
The merging is based on the implementation and sanitized token being the same. If a token contains metadata, it must be stored uniquely even when the underlying store is the same, because a token with metadata must be called individually: it may refer to a different version of the same token, and hence its value may differ
Merging strategy ¶
Same Prefix + Same SanitisedToken && No Metadata
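The merging rule above can be sketched as a key function: tokens sharing a prefix and sanitised token collapse into one key only when they carry no metadata, while a token with metadata keeps its full string as a unique key. The `token` struct and its field names here are illustrative assumptions, not the package's unexported types.

```go
package main

import "fmt"

// token is an illustrative stand-in for a parsed token.
type token struct {
	prefix    string // store implementation prefix
	sanitised string // path-like token with the prefix stripped
	metadata  string // e.g. a version qualifier; empty when absent
	full      string // the original full token string
}

// mergeKey applies the documented strategy: same prefix + same
// sanitised token && no metadata share a key; otherwise the full
// token string is used so the value is fetched individually.
func mergeKey(t token) string {
	if t.metadata == "" {
		return t.prefix + t.sanitised
	}
	return t.full
}

func main() {
	plain := token{prefix: "STORE#", sanitised: "/app/db", full: "STORE#/app/db"}
	versioned := token{prefix: "STORE#", sanitised: "/app/db",
		metadata: "version=2", full: "STORE#/app/db[version=2]"}
	fmt.Println(mergeKey(plain))     // merged under prefix + sanitised token
	fmt.Println(mergeKey(versioned)) // kept unique under the full token
}
```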
func (*NormalizedToken) WithParsedToken ¶
func (n *NormalizedToken) WithParsedToken(v *config.ParsedTokenConfig) *NormalizedToken
type NormalizedTokenSafe ¶
type NormalizedTokenSafe struct {
// contains filtered or unexported fields
}
NormalizedTokenSafe is the map of lowest common denominators, keyed by token.Keypathless, or by token.String (the full token) if metadata is included
func (NormalizedTokenSafe) GetMap ¶
func (n NormalizedTokenSafe) GetMap() map[string]*NormalizedToken
func (NormalizedTokenSafe) TokenSet ¶
func (n NormalizedTokenSafe) TokenSet() []string
type RawTokenConfig ¶
type RawTokenConfig struct {
// contains filtered or unexported fields
}
RawTokenConfig represents the map of discovered tokens via the lexer/parser
func NewRawTokenConfig ¶
func NewRawTokenConfig() *RawTokenConfig
func (*RawTokenConfig) AddToken ¶
func (rtm *RawTokenConfig) AddToken(name string, parsedToken *config.ParsedTokenConfig)
func (*RawTokenConfig) RawTokenMap ¶
func (rtm *RawTokenConfig) RawTokenMap() map[string]*config.ParsedTokenConfig
type ReplacedToken ¶
ReplacedToken is the internal working object definition and the return type if results are not flushed to file
func (ReplacedToken) MapKeys ¶
func (pm ReplacedToken) MapKeys() (keys []string)
type TokenResponse ¶
type TokenResponse struct {
Err error
// contains filtered or unexported fields
}
func (*TokenResponse) Key ¶
func (tr *TokenResponse) Key() *config.ParsedTokenConfig
func (*TokenResponse) Value ¶
func (tr *TokenResponse) Value() string
func (*TokenResponse) WithKey ¶
func (tr *TokenResponse) WithKey(key *config.ParsedTokenConfig)
func (*TokenResponse) WithValue ¶
func (tr *TokenResponse) WithValue(val string)