localcloud module v0.1.2
Published: Jul 2, 2025 License: Apache-2.0

README

LocalCloud - Ship AI Products Before Your Coffee Gets Cold


LocalCloud transforms your laptop into a complete AI development environment. Run GPT-level models, vector databases, and production infrastructure locally with zero cloud costs. No API keys, no usage limits, no data leaving your machine.

🚦 Quick Start

Installation

macOS

Homebrew (Recommended):

brew install localcloud-sh/tap/localcloud

Manual Installation:

Apple Silicon:

curl -L https://github.com/localcloud-sh/localcloud/releases/latest/download/localcloud-darwin-arm64.tar.gz | tar xz
sudo install -m 755 localcloud-darwin-arm64 /usr/local/bin/localcloud
sudo ln -sf /usr/local/bin/localcloud /usr/local/bin/lc

Intel:

curl -L https://github.com/localcloud-sh/localcloud/releases/latest/download/localcloud-darwin-amd64.tar.gz | tar xz
sudo install -m 755 localcloud-darwin-amd64 /usr/local/bin/localcloud
sudo ln -sf /usr/local/bin/localcloud /usr/local/bin/lc

Linux

Homebrew (if installed):

brew install localcloud-sh/tap/localcloud

Manual Installation:


AMD64:

curl -L https://github.com/localcloud-sh/localcloud/releases/latest/download/localcloud-linux-amd64.tar.gz | tar xz
sudo install -m 755 localcloud-linux-amd64 /usr/local/bin/localcloud
sudo ln -sf /usr/local/bin/localcloud /usr/local/bin/lc

ARM64:

curl -L https://github.com/localcloud-sh/localcloud/releases/latest/download/localcloud-linux-arm64.tar.gz | tar xz
sudo install -m 755 localcloud-linux-arm64 /usr/local/bin/localcloud
sudo ln -sf /usr/local/bin/localcloud /usr/local/bin/lc

Windows (Under Testing)
  1. Download the latest release from GitHub Releases
  2. Extract localcloud-windows-amd64.zip
  3. Add the extracted folder to your system PATH
  4. Restart your terminal
  5. Use localcloud or lc commands

Note: Windows support is experimental. WSL2 is recommended for better compatibility.

Quick Install Script

For macOS and Linux, you can also use our install script:

curl -fsSL https://raw.githubusercontent.com/localcloud-sh/localcloud/main/scripts/install.sh | bash

This script will:

  • Detect your OS and architecture
  • Use Homebrew if available (macOS)
  • Download the appropriate binary
  • Install LocalCloud and create the lc alias
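
The detection step above can be sketched roughly as follows. This is a hypothetical illustration of the OS/architecture mapping, not the actual contents of install.sh:

```shell
# Hypothetical sketch of how an installer maps `uname` output to a
# release asset name; the real install.sh may differ.
detect_asset() {
  os=""; arch=""
  case "$1" in
    Darwin) os=darwin ;;
    Linux)  os=linux ;;
  esac
  case "$2" in
    x86_64)        arch=amd64 ;;
    arm64|aarch64) arch=arm64 ;;
  esac
  echo "localcloud-${os}-${arch}.tar.gz"
}

detect_asset "$(uname -s)" "$(uname -m)"
```

The asset names correspond to the manual download URLs listed above.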

Getting Started

# Initialize a new project
lc init my-assistant
cd my-assistant

# Configure your services (interactive setup)
lc setup

You'll see an interactive wizard:

? What would you like to build? (Use arrow keys)
โฏ Chat Assistant - Conversational AI with memory
  RAG System - Document Q&A with vector search  
  Speech Processing - Whisper + TTS
  Custom - Select components manually

? Select components you need: (Press <space> to select, <enter> to confirm)
โฏ โ—ฏ [AI] LLM (Text generation) - Large language models for text generation
  โ—ฏ [AI] Embeddings (Semantic search) - Text embeddings for similarity
  โ—ฏ [AI] Speech-to-Text (Whisper) - Convert speech to text
  โ—ฏ [Database] Vector Database (pgvector) - PostgreSQL with pgvector
  โ—ฏ [Infrastructure] Cache (Redis) - In-memory cache for sessions
  โ—ฏ [Infrastructure] Queue (Redis) - Job queue for background processing
  โ—ฏ [Infrastructure] Object Storage (MinIO) - S3-compatible storage
# Start all selected services
lc start

# Your AI services are now running!
# Check status: lc status

Note: lc is the short alias for localcloud - use whichever you prefer!
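
Once lc start reports the services as running, you can sanity-check the AI endpoint yourself. The helper below is a generic polling sketch, not an lc command; it assumes Ollama's default port 11434:

```shell
# Generic sketch: poll an HTTP endpoint until it answers (or give up).
# Not part of the LocalCloud CLI -- just a convenience for scripts.
wait_for_http() {  # usage: wait_for_http URL [TRIES]
  url=$1; tries=${2:-30}; i=0
  while [ "$i" -lt "$tries" ]; do
    curl -sf -o /dev/null "$url" && return 0
    i=$((i + 1)); sleep 1
  done
  return 1
}

# Ollama listens on port 11434 by default:
wait_for_http http://localhost:11434/ 3 && echo "AI service is up" || echo "not reachable yet"
```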

✨ Key Features

  • 🚀 One-Command Setup: Get started in seconds with lc setup
  • 💰 Zero Cloud Costs: Everything runs locally - no API fees or usage limits
  • 🔒 Complete Privacy: Your data never leaves your machine
  • 📦 Pre-built Templates: Production-ready backends for common AI use cases
  • 🧠 Optimized Models: Carefully selected models that run on 4GB RAM
  • 🔧 Developer Friendly: Simple CLI, clear errors, extensible architecture
  • 🐳 Docker-Based: Consistent environment across all platforms
  • 🌍 Mobile Ready: Built-in tunnel support for demos anywhere

🎯 Vision

Make AI development as simple as running a local web server.

LocalCloud eliminates the complexity and cost of AI development by providing a complete, local-first development environment. No cloud bills, no data privacy concerns, no complex configurations - just pure development productivity.

💡 Perfect For These Scenarios

๐Ÿข Enterprise POCs Without The Red Tape

Waiting 3 weeks for cloud access approval? Your POC could be done by then. LocalCloud lets you build and demonstrate AI solutions immediately, no IT tickets required.

📱 Mobile Demos That Actually Work

Present from your phone to any client's screen. Built-in tunneling means you can demo your AI app from anywhere - coffee shop WiFi, client office, or conference room.

💸 The $2,000 Cloud Bill You Forgot About

We've all been there - spun up a demo, showed the client, forgot to tear it down. With LocalCloud, closing your laptop is shutting down the infrastructure.

🎓 Turn Lecture Recordings into Study Notes

Got 50 hours of lecture recordings? LocalCloud + Whisper can transcribe them all for free. Add RAG and you've got an AI study buddy that knows your entire course.

๐Ÿ” When "Cloud-First" Meets "Compliance-First"

Healthcare, finance, government? Some data can't leave the building. LocalCloud keeps everything local while giving you cloud-level capabilities.

🚀 Hackathon Secret Weapon

No API rate limits. No usage caps. No waiting for credits. Just pure development speed when every minute counts.

💼 Build Commercial Products Without Burning Cash

Your Own Cursor/Copilot: Build an AI code editor without $10k/month in API costs during development.
AI Mobile Apps: Develop and test your AI-powered iOS/Android app locally until you're ready to monetize.
SaaS MVP: Validate your AI startup idea without cloud bills - switch to cloud only after getting paying customers.

🎯 Technical Interview Assignments That Shine

For Employers: Give candidates a pre-configured LocalCloud environment. No setup headaches, just coding skills evaluation.
For Candidates: Submit a fully-working AI application. While others struggle with API keys, you ship a complete solution.

๐Ÿ› ๏ธ Internal Tools That Would Never Get Budget

AI Customer Support Trainer: Process your support tickets locally to train a custom assistant.
Code Review Bot: Build a team-specific code reviewer without sending code to external APIs.
Meeting Transcription System: Record, transcribe, and summarize meetings - all on company hardware.

📚 Available Templates

During lc setup, you can choose from pre-configured templates or customize your own service selection:

1. Chat Assistant with Memory

lc init my-assistant
lc setup  # Select "Chat Assistant" template

  • Conversational AI with persistent memory
  • PostgreSQL for conversation storage
  • Streaming responses
  • Model switching support

2. RAG System (Retrieval-Augmented Generation)

lc init my-knowledge-base
lc setup  # Select "RAG System" template

  • Document ingestion and embedding
  • Vector search with pgvector
  • Context-aware responses
  • Scalable to millions of documents

3. Speech-to-Text with Whisper

lc init my-transcriber
lc setup  # Select "Speech/Whisper" template

  • Audio transcription API
  • Multiple language support
  • Real-time processing
  • Optimized Whisper models

4. Custom Selection

lc init my-project
lc setup  # Choose "Custom" and select individual services

  • Pick only the services you need
  • Configure each service individually
  • Optimal resource usage

Note: the MVP includes backend infrastructure only; frontend applications are coming in v2.

๐Ÿ—๏ธ Architecture

LocalCloud Project Structure:
├── .localcloud/          # Project configuration
│   └── config.yaml       # Service configurations
├── docker-compose.yml    # Generated service definitions
└── .env                  # Environment variables

Setup Flow

  1. Initialize: lc init creates project structure
  2. Configure: lc setup opens interactive wizard
    • Choose template or custom services
    • Select AI models
    • Configure ports and resources
  3. Start: lc start launches all services
  4. Develop: Services are ready for your application

Core Services

Service    Description                    Default Port
AI/LLM     Ollama with selected models    11434
Database   PostgreSQL with pgvector       5432
Cache      Redis for performance          6379
Queue      Redis for job processing       6380
Storage    MinIO (S3-compatible)          9000/9001
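
For reference, the default ports above translate into conventional connection URLs. The helper below is purely illustrative; the database name shown is a placeholder, not a LocalCloud default. Use lc service url and your project's .env file for the real values:

```shell
# Illustrative only: map the default ports above to connection URLs.
# The database name "localcloud" is a placeholder assumption.
service_url() {
  case "$1" in
    ai)       echo "http://localhost:11434" ;;
    database) echo "postgresql://localhost:5432/localcloud" ;;
    cache)    echo "redis://localhost:6379" ;;
    queue)    echo "redis://localhost:6380" ;;
    storage)  echo "http://localhost:9000" ;;
    *)        return 1 ;;
  esac
}

service_url database   # postgresql://localhost:5432/localcloud
```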

AI Components

Component          Service            Use Case
Whisper            Speech-to-Text     Audio transcription
Piper              Text-to-Speech     Voice synthesis
Stable Diffusion   Image Generation   AI images
Qdrant             Vector Database    Similarity search

๐Ÿ› ๏ธ System Requirements

  • OS: macOS, Linux, Windows (WSL2)
  • RAM: 4GB minimum (8GB recommended)
  • Disk: 10GB free space
  • Docker: Docker Desktop or Docker Engine
  • CPU: x64 or ARM64 processor

Note: LocalCloud is written in Go for performance, but you don't need Go installed. The CLI is a single binary that works everywhere.

🎮 Command Reference

Project Commands

# Initialize new project
lc init [project-name]

# Interactive setup wizard
lc setup

# Start all services
lc start

# Stop all services
lc stop

# View service status
lc status

# View logs
lc logs [service]

Service Management

# Start specific service
lc service start postgres
lc service start whisper

# Stop specific service
lc service stop postgres

# Restart service
lc service restart ai

# Get service URL
lc service url postgres

Database Commands

# Connect to database
lc db connect

# Backup database
lc db backup

# Restore from backup
lc db restore backup-file.sql

# Run migrations
lc db migrate
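
Since lc db restore takes a backup file, timestamped file names make rotation straightforward. The naming helper below is a hypothetical convention, not an lc feature; check lc db backup --help for how backups are actually written:

```shell
# Hypothetical convention: timestamped backup file names.
backup_name() {
  echo "backup-$(date +%Y%m%d-%H%M%S).sql"
}

backup_name   # e.g. backup-20250702-093000.sql
```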

Model Management

# List available models
lc models list

# Pull a new model
lc models pull llama3.2:3b

# Remove a model
lc models remove llama3.2:3b

# Show model information
lc models info qwen2.5:3b
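
Model references above follow Ollama's name:tag convention, where the tag usually encodes the parameter size. A small shell illustration of splitting a reference (the helpers are hypothetical, for scripting around the CLI):

```shell
# name:tag convention used by `lc models pull` (same as Ollama).
model_name() { echo "${1%%:*}"; }   # part before the first colon
model_tag()  { echo "${1#*:}"; }    # part after the first colon

model_name "llama3.2:3b"   # llama3.2
model_tag  "llama3.2:3b"   # 3b
```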

🔧 Configuration

LocalCloud uses a simple YAML configuration:

# .localcloud/config.yaml
project:
   name: my-assistant
   version: 1.0.0

models:
   default: qwen2.5:3b
   embeddings: nomic-embed-text

services:
   ai:
      memory_limit: 2GB
      gpu: false

   database:
      port: 5432
      extensions:
         - pgvector

๐Ÿ› Troubleshooting

Docker Not Running

# Check Docker status
docker info

# macOS/Windows: Start Docker Desktop
# Linux: sudo systemctl start docker

Port Already in Use

# Find process using port
lsof -i :3000

# Use different port
lc start --port 3001

Model Download Issues

# Check available space
df -h

# Clear unused models
lc models prune

Database Connection Failed

# Check if PostgreSQL is running
lc status postgres

# View PostgreSQL logs
lc logs postgres

# Restart PostgreSQL
lc service restart postgres

๐Ÿค Contributing

We welcome contributions! See CONTRIBUTING.md for:

  • Development setup
  • Code style guidelines
  • Testing requirements
  • Pull request process

📖 Documentation

๐Ÿ—บ๏ธ Roadmap

✅ Phase 1 - MVP (Current)

  • Core CLI with lc alias
  • Interactive setup wizard
  • Docker service orchestration
  • PostgreSQL with pgvector
  • Model management
  • Service templates
  • Service health monitoring

🚧 Phase 2 - Application Layer

  • Backend code templates
  • LocalCloud SDK
  • API scaffolding
  • Migration system

🔮 Phase 3 - Frontend & Advanced

  • Next.js frontend templates
  • Web-based admin panel
  • Mobile app support
  • Model fine-tuning
  • Kubernetes support

📄 License

Apache License 2.0 - see LICENSE for details.

Why Apache 2.0?

  • ✅ Enterprise-friendly - Your legal team will actually approve it
  • ✅ Patent protection - Explicit patent grants protect everyone
  • ✅ Commercial use - Build products, sell services, no restrictions
  • ✅ Modification - Fork it, change it, make it yours
  • ✅ Attribution - Just keep the license notice

Perfect for both startups building products and enterprises needing compliance.

๐Ÿ™ Acknowledgments

LocalCloud is built on the shoulders of giants:

  • Ollama for local model serving
  • PostgreSQL for reliable data storage
  • Docker for containerization
  • All the open-source AI models making this possible

LocalCloud - AI development at zero cost, infinite possibilities
Website • GitHub • Discord • Twitter

Directories

Path Synopsis
cmd
localcloud command
internal/cli
Package cli implements the command-line interface for LocalCloud
internal/components
Package components provides component registry and management for LocalCloud
internal/config
internal/config/config.go
internal/diagnostics
internal/diagnostics/debugger.go
internal/docker
internal/docker/client.go
internal/logging
internal/logging/aggregator.go
internal/models
Package models provides AI model management functionality
internal/monitoring
internal/monitoring/collector.go
internal/network
internal/network/connectivity.go
internal/recovery
internal/recovery/health_monitor.go
internal/services
internal/services/manager.go
internal/services/postgres
internal/services/postgres/client.go
internal/services/vectordb
internal/services/vectordb/interface.go
internal/services/vectordb/providers/pgvector
internal/services/vectordb/providers/pgvector/pgvector.go
