Privacy Policy

Your privacy is important to us. This policy explains how we collect, use, and protect your information.

We don't look at your code or conversations and we don't want to. Your work stays on your machine with your locally installed Ollama.

We do collect analytics data, like which features you use, your login patterns, and error logs. This helps us fix bugs and improve the parts of LocalCode that matter most to developers.

Where We Store Your Data

Your conversations and code stay entirely on your computer with your Ollama installation. When you use LocalCode, everything runs locally: your prompts, model responses, code, and files all live in your local environment and never leave your machine to reach our servers.

This means your sensitive code, proprietary projects, personal conversations, debugging sessions, and any other work you do stay private and under your control. We literally can't see any of it, even if we wanted to.

Here's what we do store elsewhere:

In our encrypted database, we keep your account info: your email address and, if you connect GitHub, your integration details.

We send analytics events to PostHog when you use features like creating GitHub workspaces or switching between projects, and when errors happen. These events record which feature you used and the error message when things break (there's a sketch of one below).

We don't record session replays or screenshots.
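
Curious what one of those events looks like? Here's a rough sketch using PostHog's Python client; the event name, user ID, and properties are illustrative rather than our exact instrumentation, but the shape is the point: a feature name and an error message, never your prompts, code, or files.

    # Illustrative analytics event (not our exact instrumentation).
    # What's here: a feature name and an error message.
    # What's absent: prompts, model output, code, file contents.
    from posthog import Posthog

    posthog = Posthog(project_api_key="phc_example_key", host="https://us.i.posthog.com")

    # classic capture call: (distinct_id, event name, properties)
    posthog.capture(
        "user-123",                   # account identifier
        "github_workspace_created",   # which feature was used
        {"error": None},              # or an error message when things break
    )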

Your AI Workloads

Inference runs locally via Ollama on your machine. We never intercept, proxy, or store any of your conversations.
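
Don't just take our word for it; you can watch the traffic yourself. Here's a minimal sketch of what a local inference call looks like, assuming Ollama's default port (11434); "llama3" is just an example model name. The request goes to localhost and never leaves your machine.

    # Minimal sketch: chat with a local Ollama model over its HTTP API.
    # Assumes Ollama's default port; "llama3" is an example model name.
    import requests

    response = requests.post(
        "http://localhost:11434/api/chat",  # loopback only, no external hop
        json={
            "model": "llama3",
            "messages": [{"role": "user", "content": "Why is my build failing?"}],
            "stream": False,
        },
    )
    print(response.json()["message"]["content"])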

Model Policies

Since everything runs locally with Ollama, no third-party AI provider receives your data. Individual models may come with their own licenses, though, so review each model's terms in the Ollama registry.
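
If you want to check a model's terms without leaving your terminal, here's a rough sketch, assuming a recent Ollama whose /api/show response includes the model's bundled license text; "llama3" is again just an example:

    # Sketch: read a model's bundled license from the local Ollama API.
    # Assumes a recent Ollama that returns a "license" field from /api/show.
    import requests

    response = requests.post(
        "http://localhost:11434/api/show",
        json={"model": "llama3"},  # example model name
    )
    print(response.json().get("license", "No license text bundled with this model."))

The CLI equivalent, ollama show llama3 --license, should print the same text.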

Data Security

Any data we store is encrypted and accessible only to our team. We store it with SOC 2 compliant services: Supabase for our database and PostHog for analytics.

Questions? Contact us at hi@LocalCode.dev

Last updated: 11/5/2025