package score

v0.1.1
Published: Mar 24, 2026 License: MIT Imports: 12 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func ComputeMaxParams

func ComputeMaxParams(fa *lang.FileAnalysis) int

ComputeMaxParams returns the maximum parameter count of any function/method in the file. High parameter counts indicate complex interfaces that are harder for AI agents to modify safely.

func SizeGrade

func SizeGrade(lines int) string

SizeGrade returns a grade based on line count thresholds from production experience.

Types

type AmbiguityResult

type AmbiguityResult struct {
	GrepNoise int `json:"grep_noise"`
}

AmbiguityResult is the per-file grep noise measurement. Grep noise = sum of (definition_count - 1) for each name this file defines that is also defined in other files. Measures how much search noise this file's definitions create across the codebase.

func ComputeAmbiguity

func ComputeAmbiguity(fa *lang.FileAnalysis, idx nameIndex) AmbiguityResult

ComputeAmbiguity computes grep noise for a single file given the global name index. Grep noise combines two sources:

  • Definition noise: ambiguous names this file DEFINES (other files also define them)
  • Reference noise: ambiguous names this file IMPORTS (it will need to disambiguate)

Combined noise correlates at r=+0.521 vs r=+0.430 for definition noise alone (validated on Undercity, 8,817 sessions).

type AmbiguousName

type AmbiguousName struct {
	Name  string           `json:"name"`
	Count int              `json:"count"`
	Sites []DefinitionSite `json:"sites"`
}

AmbiguousName is a name defined in multiple files.

func CollectAmbiguousNames

func CollectAmbiguousNames(idx nameIndex, threshold int) []AmbiguousName

CollectAmbiguousNames returns all names defined in 2+ files of the same language.

type BlastRadius

type BlastRadius struct {
	ImportedByCount int            `json:"imported_by_count"`
	ImportedBy      []string       `json:"imported_by,omitempty"`
	MostExported    []ExportedName `json:"most_exported,omitempty"`
}

BlastRadius measures how many files could break if this file is edited.

func ComputeBlastRadius

func ComputeBlastRadius(path string, analyses map[string]*lang.FileAnalysis, ncIdx nameConsumerIndex) BlastRadius

ComputeBlastRadius computes how many files import from this file.

type CommentStats

type CommentStats struct {
	CommentLines int     `json:"comment_lines"`
	CodeLines    int     `json:"code_lines"`
	Density      float64 `json:"density"` // comment_lines / total_lines (0.0-1.0)
}

CommentStats describes comment density in a file.

func ComputeCommentStats

func ComputeCommentStats(fa *lang.FileAnalysis, src []byte) CommentStats

ComputeCommentStats counts comment vs code lines. Uses simple heuristics per language — not tree-sitter, just line scanning.

type ContextReads

type ContextReads struct {
	Total       int                 `json:"total"`
	Unnecessary int                 `json:"unnecessary"`
	Relocatable []RelocatableImport `json:"relocatable,omitempty"`
}

ContextReads measures how many files the AI must read to understand this file.

func ComputeContextReads

func ComputeContextReads(fa *lang.FileAnalysis, consumers importConsumerMap, ctx localImportContext) ContextReads

ComputeContextReads computes the context reads metric for a file.

type DefinitionSite

type DefinitionSite struct {
	File      string `json:"file"`
	Line      int    `json:"line"`
	Qualified string `json:"qualified_name"`
}

DefinitionSite is one location where a name is defined.

type DiffResult

type DiffResult struct {
	Version      string     `json:"version"`
	Schema       int        `json:"schema"`
	Ref          string     `json:"ref"`
	FilesChanged int        `json:"files_changed"`
	Regressions  int        `json:"regressions_count"`
	Files        []FileDiff `json:"files"`
}

DiffResult is the output of adit score --diff.

type ExportedName

type ExportedName struct {
	Name      string `json:"name"`
	Consumers int    `json:"consumers"`
}

ExportedName tracks how many files consume a particular exported name.

type FileDiff

type FileDiff struct {
	Path        string       `json:"path"`
	Status      string       `json:"status"` // "modified" | "added" | "deleted"
	Before      *FileScore   `json:"before,omitempty"`
	After       *FileScore   `json:"after,omitempty"`
	Regressions []Regression `json:"regressions,omitempty"`
}

FileDiff compares a file's metrics between a ref and HEAD.

type FileScore

type FileScore struct {
	Path            string          `json:"path"`
	Lines           int             `json:"lines"`
	SizeGrade       string          `json:"size_grade"`
	MaxNestingDepth int             `json:"max_nesting_depth"`
	NodeDiversity   int             `json:"node_diversity"`
	MaxParams       int             `json:"max_params"`
	Functions       FunctionStats   `json:"functions"`
	ContextReads    ContextReads    `json:"context_reads"`
	Ambiguity       AmbiguityResult `json:"ambiguity"`
	Comments        CommentStats    `json:"comments"`
	Graph           GraphMetrics    `json:"graph"`
	BlastRadius     BlastRadius     `json:"blast_radius"`
}

FileScore is the complete analysis result for a single file.

type FunctionStats

type FunctionStats struct {
	Count     int `json:"count"`
	MaxLength int `json:"max_length"` // longest function in lines
	AvgLength int `json:"avg_length"` // average function length in lines
}

FunctionStats describes the distribution of function/method lengths in a file.

func ComputeFunctionStats

func ComputeFunctionStats(fa *lang.FileAnalysis) FunctionStats

ComputeFunctionStats calculates function/method length distribution for a file.

type GraphMetrics

type GraphMetrics struct {
	TransitiveDepth int `json:"transitive_depth"` // longest import chain from this file
	TransitiveCount int `json:"transitive_count"` // unique files reachable via imports
}

GraphMetrics describes a file's position in the dependency graph.

func ComputeGraphMetrics

func ComputeGraphMetrics(path string, graph importGraph) GraphMetrics

ComputeGraphMetrics computes transitive dependency metrics for a file.

type ImportCycle

type ImportCycle struct {
	Files          []string `json:"files"`
	Length         int      `json:"length"`
	Recommendation string   `json:"recommendation"`
}

ImportCycle is a circular dependency chain between files.

func DetectCycles

func DetectCycles(analyses map[string]*lang.FileAnalysis, localCtx localImportContext) []ImportCycle

DetectCycles finds import cycles using DFS on the import graph.

type Pipeline

type Pipeline struct {
	// contains filtered or unexported fields
}

Pipeline orchestrates the two-pass analysis.

func NewPipeline

func NewPipeline(frontends []lang.Frontend, cfg config.Config) *Pipeline

NewPipeline creates a pipeline with the given frontends and config.

func (*Pipeline) ScoreFile

func (p *Pipeline) ScoreFile(path string) (*FileScore, error)

ScoreFile scores a single file (still requires scanning siblings for cross-file metrics).

func (*Pipeline) ScoreRepo

func (p *Pipeline) ScoreRepo(paths []string) (*RepoScore, error)

ScoreRepo scores all files under the given paths.

func (*Pipeline) ScoreRepoDiff

func (p *Pipeline) ScoreRepoDiff(paths []string, ref string) (*DiffResult, error)

ScoreRepoDiff scores changed files between a git ref and HEAD, reporting regressions.

type Regression

type Regression struct {
	Metric string `json:"metric"`
	Before int    `json:"before"`
	After  int    `json:"after"`
	Delta  int    `json:"delta"` // positive = worse
}

Regression is a metric that got worse between two versions.

type RelocatableImport

type RelocatableImport struct {
	Name     string `json:"name"`
	Kind     string `json:"kind"`      // "constant" | "function" | "type" | "unknown"
	From     string `json:"from"`      // source file path
	FromLine int    `json:"from_line"` // line number in source file
	To       string `json:"to"`        // destination (consumer) file path
	Reason   string `json:"reason"`    // e.g. "single consumer"
}

RelocatableImport is a single-consumer import that should be co-located with its only consumer file.

type RepoScore

type RepoScore struct {
	Version      string      `json:"version"`
	Schema       int         `json:"schema"`
	FilesScanned int         `json:"files_scanned"`
	Files        []FileScore `json:"files"`
	Summary      RepoSummary `json:"summary"`
}

RepoScore is the top-level output of adit score.

type RepoSummary

type RepoSummary struct {
	Relocatable    []RelocatableImport `json:"relocatable,omitempty"`
	AmbiguousNames []AmbiguousName     `json:"ambiguous_names,omitempty"`
	Cycles         []ImportCycle       `json:"cycles,omitempty"`
	HighBlast      []FileScore         `json:"high_blast_radius,omitempty"`
}

RepoSummary aggregates cross-file findings.
