SectorFlux

Monitor, Debug, and Optimize Your Local AI

A lightweight, high-performance flight recorder for local LLM agents. See every request, measure every token, debug every issue.

Open source under GPL-3.0 | Single binary, no dependencies

Everything You Need

Powerful features for debugging and optimizing your local AI agents

🔄 Real-time Streaming Proxy
Transparent forwarding with live token streaming. Your apps work exactly as before.

📝 Request/Response Logging
Complete history of all LLM interactions. Never lose a conversation again.

📊 Performance Metrics
Token counts, latency, tokens per second (TPS), and time-to-first-token (TTFT). Know exactly how your AI performs; see the measurement sketch after this list.

💾 Smart Caching
Optional response caching to reduce redundant API calls and speed up development; see the timing sketch after the Quick Start example.

💬 Chat Playground
Test prompts directly from the dashboard. No need for external tools.

📦 Single Binary (~10MB)
No dependencies, no Node.js, no Python. Just download and run.
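
Both the streaming proxy and the metrics are easy to sanity-check from the client side. Here is a minimal sketch that streams a completion through SectorFlux and measures time-to-first-token and rough throughput itself. It assumes the proxy is running on port 8888 as in the Quick Start below, and "llama3" is a placeholder for whatever model your Ollama instance actually serves; each stream chunk is counted as roughly one token.

# Sketch: measure TTFT and approximate throughput through the proxy.
# Assumptions: SectorFlux on localhost:8888, a model named "llama3" in Ollama.
import time
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8888/v1", api_key="ollama")

start = time.perf_counter()
first_token_at = None
chunks = 0

stream = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Explain DNS in one short paragraph."}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        if first_token_at is None:
            first_token_at = time.perf_counter()  # first content chunk arrived
        chunks += 1

elapsed = time.perf_counter() - start
if first_token_at is not None:
    print(f"time-to-first-token: {first_token_at - start:.2f}s")
print(f"throughput: ~{chunks / elapsed:.1f} tokens/s over {elapsed:.2f}s")

The numbers you compute here should line up with what the dashboard reports for the same request.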

Up and Running in Seconds

Just change your port from 11434 to 8888

Before (Direct to Ollama)

# Your existing code
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama"
)

After (Through SectorFlux)

# Just change the port!
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8888/v1",
    api_key="ollama"
)

That's it. Your dashboard is now live at http://localhost:8888
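
If you turn on the optional response cache, you can watch it work by sending the same request twice and comparing wall-clock time. A minimal sketch, assuming the cache is enabled and "llama3" is again a placeholder model name:

# Sketch: observe the optional response cache by timing identical requests.
import time
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8888/v1", api_key="ollama")

def timed_request():
    start = time.perf_counter()
    resp = client.chat.completions.create(
        model="llama3",
        messages=[{"role": "user", "content": "What is the capital of France?"}],
    )
    return resp.choices[0].message.content, time.perf_counter() - start

answer, cold = timed_request()   # first call hits the model
answer, warm = timed_request()   # identical request; may be served from cache
print(f"cold: {cold:.2f}s  warm: {warm:.2f}s")

If the second call isn't dramatically faster, the cache is most likely disabled, or the two requests differ in some field the cache keys on (an assumption; the exact cache key is not documented here).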

See It In Action

A beautiful, intuitive dashboard for all your AI monitoring needs

Dashboard
Real-time metrics and request history at a glance.

Chat Playground
Test your prompts directly from the interface.

COMING SOON

SectorFlux Pro

Advanced analytics, team collaboration, and enterprise features.

📈 Advanced Analytics
👥 Team Collaboration
🏢 Enterprise Features
🔧 Priority Support

Join the Community

Be the first to know about updates and the Pro launch.