lx: Controllable AI for Gophers!
lx restores the joy of wrangling code by letting you define the forest, while AI plants the trees under your intentional command.
Philosophy
Some argue that Vibe Coding is just a natural evolution like the shift from punch cards to Assembly, and then to C.
I fundamentally disagree.
Historical abstractions were about placing a better hammer in the developer's hand.
The tools evolved, but the wielder remained human. The outcome was always under our absolute control.
Today's Vibe Coding is different.
The robot has taken the hammer. We’ve been sidelined, left to do nothing but beg the machine for the right output.
If we do not wield the hammer, we are no longer makers.
If it is not under our control, it's not programming anymore.
That is why I built lx.
lx puts the hammer back in the Gopher's hand, restoring absolute control.
It allows you to command AI strictly on your terms through a new paradigm: LLM Functionalization.
It all works in your editor, on your machine, under your terms.
Features
Developer Sovereignty
The function signature is the contract. lx treats your function names, parameters, and return types as an immutable contract.
AI cannot change your architecture; it can only fulfill the implementation details you demand.
Deterministic AI Coding
lx doesn't just read your code; it runs it.
If a function is not reached during execution, lx touches nothing.
It strictly adheres to the Go runtime path. No hallucinations for dead code, no assumptions about unused functions.
Furthermore, if your code fails to compile or crashes (panic) during the capture phase, lx immediately halts.
It refuses to generate code on a broken foundation.
Continuous Development
With lx, once you write a function’s inputs and outputs, you can keep moving as if the logic is already there.
But it’s not “design-only” — your program still has to compile and run.
You drive the flow and the contract; lx uses runtime traces to fill in the function internals.
Environment Independence
Code for any target, from any host.
With Build Tags support, lx bridges the gap between your development machine and your production environment.
Quick Start
You only need three things:
- Input: function params (real runtime values)
- Intent: lx.Gen(...) (your instruction)
- Output: return type + dummy return (the expected shape)
Step 1: Write the contract
// test.go
package main
import (
"fmt"
"github.com/chebread/lx"
)
func main() {
// You own the flow.
fmt.Println(LX_Greeter("Gopher", true))
}
func LX_Greeter(name string, isMorning bool) string {
// Intent: describe the logic. (captured at runtime)
lx.Gen(fmt.Sprintf("Greet %s politely. Is it morning? %v", name, isMorning))
// Output: dummy return that matches the real output shape.
return "Dummy Greeting String"
}
Step 2: Run lx
lx only generates code for functions that are actually reached during execution.
If LX_Greeter is never called in main(), lx will assume it is dead code and skip it.
lx [flags...] [PATH]
PATH can be . (project root), a relative path, or an absolute path.
- If omitted, lx defaults to the current directory (.).
Step 3: Review the result
func LX_Greeter(name string, isMorning bool) string {
// lx-prompt: Greet Gopher politely. Is it morning? true
// lx-dep: fmt
if isMorning {
return fmt.Sprintf("Good morning, %s! Hope you have a productive day.", name)
}
return fmt.Sprintf("Good evening, %s. Time to rest.", name)
}
Usage
One Function, One Task
LLM Functionalization means mapping one function to one AI task.
lx replaces the entire body of any function containing lx.Gen.
Do not mix orchestration logic and AI intent in the same function.
Bad example
The AI will overwrite your main logic, and MyWorker() will never be called.
func main() {
lx.Gen("Print hello") // <--- This takes over the whole function!
MyWorker() // <--- This will be DELETED.
}
Good example
Keep your control flow (Orchestrator) separate from AI tasks (Functional Unit).
func main() {
// Orchestrator: You own the flow.
PrintHello()
MyWorker()
}
func PrintHello() {
// Functional Unit: AI owns the implementation.
lx.Gen("Print hello")
}
Capture Mode (LX_MODE=capture)
During the capture run, lx sets the environment variable LX_MODE=capture.
This is intentional: it lets you detect capture runs and explicitly control side effects.
If your program does anything risky (network calls, DB writes, file operations), guard it:
if os.Getenv("LX_MODE") == "capture" {
// capture run: keep it safe and quiet
// (no network calls, no DB writes, no destructive operations)
}
lx executes your code to capture runtime data. However, if your code depends on hardware-specific packages (like machine in TinyGo), platform-specific APIs (Windows syscalls), or heavy external libraries not present on your local machine, the capture phase will fail to compile.
You can solve this using Go's native Build Tags to "mock" these dependencies.
Step 1: Split your logic
Create a mock version of your platform-dependent code.
hardware_real.go (Targeting the actual device)
//go:build !lx_mock
package main
import "machine" // TinyGo specific package
func InitHardware() {
machine.GP15.Configure(machine.PinConfig{Mode: machine.PinInputPullup})
}
hardware_mock.go (Targeting your local PC for lx capture)
//go:build lx_mock
package main
func InitHardware() {
// No-op or log for local capture
}
Step 2: Run lx with the mock tag
Tell lx to include your mock implementation during the capture run. This lets lx bypass packages that do not exist on your Mac/Linux host and still capture the data flow.
# You can use any tag name (e.g., mock, dev, capture)
lx -tags lx_mock .
Why this is powerful:
- Universal Capture: Generate code for a $2 Pico or a Windows Server while working on a MacBook.
- Environment Independence: Run lx without setting up complex databases or CGO dependencies by mocking them during the capture phase.
- Pure Logic Focus: By isolating "noisy" infrastructure, the AI focuses 100% on synthesizing your core business logic.
What lx [PATH] does
- Injects spies into target functions (wraps returns with lx.Spy(...))
- Runs go run . inside the target directory to capture real inputs and expected outputs
- Reverts the injected spies immediately (restores your source files)
- Generates and patches the function bodies based on the captured evidence
Useful flags (optional)
- -timeout=2m: stop capture if your program doesn’t exit
- -show-stdout=true: show your program’s stdout (trace lines excluded)
- -max-prompt, -max-context, -max-output: bound what gets sent to the LLM
Dependency Management
lx respects the Single Responsibility Principle (SRP): it generates code, but it doesn't silently install packages.
If the AI uses a new library (e.g., github.com/google/uuid), lx will:
- Add // lx-dep: ... comments in the code.
- Report any issues to the terminal.
Freedom of Editor
VS Code? Vim? Zed? Cursor? It doesn't matter.
lx is a CLI tool. It works where you work.
Hierarchical Configuration
lx follows strict precedence:
Local config (project) overrides Global config (home).
It adapts to your workspace, not the other way around.
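For example (the model names below are placeholders, not recommendations), when both files exist, the project-level values win:

```yaml
# ~/lx-config.yaml (global default, used by every project)
provider: "gemini"
model: "gemini-2.0-flash"
```

```yaml
# ./lx-config.yaml (project override: this model is used inside this project)
provider: "gemini"
model: "gemini-2.0-pro"
```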
Installation & Config
To use lx, you need both the CLI tool and the Go library. Here is the straightforward setup for any Gopher.
1. Install the CLI
lx is distributed via Homebrew for macOS.
brew tap chebread/lx
brew install lx
> [!NOTE]
> Currently, only macOS (Intel/Apple Silicon) is supported.
2. Add the Library to your Project
Add the dependency to your project to use lx.Gen() in your code.
go get github.com/chebread/lx
3. Configuration
Create an lx-config.yaml file in your home directory (~/) or project root.
lx supports two modes: Direct API and Universal Command.
Option A: Direct API (Google Gemini)
The simplest setup if you have a Google API Key.
provider: "gemini"
api_key: "YOUR_API_KEY"
model: "gemini-2.0-flash"
Option B: Universal CLI (Gemini, Claude, Ollama, etc.)
lx can wrap any CLI tool installed on your machine.
Use the command provider and define the argument template. lx will automatically substitute {{prompt}} and {{model}} at runtime.
1. Google Gemini CLI (Zero Cost / No API Key in Config)
provider: "command"
bin_path: "/usr/local/bin/gemini" # Check with `which gemini`
model: "gemini-2.0-flash"
args:
- "-p"
- "{{prompt}}"
- "-m"
- "{{model}}"
- "-o"
- "text"
2. Claude Code (Anthropic)
provider: "command"
bin_path: "/usr/local/bin/claude"
model: "claude-3-7-sonnet"
args:
- "-p"
- "{{prompt}}"
- "--model"
- "{{model}}"
3. Ollama (Local / Offline / Free)
Run Llama 3 or DeepSeek locally without internet.
provider: "command"
bin_path: "/usr/local/bin/ollama"
model: "llama3"
args:
- "run"
- "{{model}}"
- "{{prompt}}"
License
This project is licensed under the AGPL-3.0 License.