What STING Actually Does
Powerful AI. Your servers. Your rules.
Yeah, you can run an LLM locally. But can you stop it from memorizing your clients' social security numbers? Can you package your team's knowledge into something searchable? Can you keep everything local while still getting useful answers? That's where we come in.
The TL;DR
Honey Jars: Your Knowledge, Bottled
All that tribal knowledge trapped in people's heads and random SharePoint folders? Put it somewhere useful. Honey Jars are encrypted containers that make your expertise searchable without exposing the raw data.
- Build unlimited knowledge bases from docs, processes, whatever you've got
- Fine-grained permissions—you decide who sees what
- Export everything—your data is always yours
- AES-256 encryption at rest
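The "searchable without exposing the raw data" idea boils down to: index the content, encrypt the originals, and only decrypt what a permitted query matches. Here's a toy stdlib-only sketch of just the indexing half (the encryption, permissions, and STING's actual storage format are not shown; all names are illustrative):

```python
from collections import defaultdict

def build_index(docs: dict[str, str]) -> dict[str, set[str]]:
    """Map each lowercase token to the set of document ids containing it."""
    index: defaultdict[str, set[str]] = defaultdict(set)
    for doc_id, text in docs.items():
        for token in text.lower().split():
            index[token.strip(".,!?")].add(doc_id)
    return dict(index)

docs = {
    "onboarding": "New hires get laptop access on day one.",
    "runbook": "Restart the ingest service before rotating keys.",
}
index = build_index(docs)
print(sorted(index["keys"]))  # → ['runbook']
```

A real Honey Jar does far more (chunking, embeddings, AES-256 at rest), but the query path is the same shape: search the index, fetch only the matching pieces.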
Your Servers, Your Rules
Run it on-prem. Run it air-gapped. Run it behind seventeen firewalls if that makes you happy. We don't judge. We also don't phone home.
- Your data never leaves unless you tell it to
- Cloud AI is optional—use it for non-sensitive stuff if you want
- Air-gap friendly for the truly paranoid (or government-mandated)
- Works with your existing security setup
- No telemetry, no tracking, no "anonymous usage data"
Bee AI Actually Learns Your Business
Generic AI gives generic answers. Bee AI reads your stuff and talks like someone who's actually worked there. Novel concept, we know.
- Learns from your specific documents
- Picks up your industry jargon
- Answers based on YOUR data, not "based on my training data from 2023"
- Remembers conversation context
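The "remembers conversation context" bullet usually means recent turns get resent with each request, trimmed to fit the model's context window. A rough sketch of that trimming, with character counts standing in for real token counts (STING's actual strategy may differ):

```python
def trim_history(messages: list[dict], max_chars: int = 2000) -> list[dict]:
    """Keep the most recent turns whose combined length fits the budget."""
    kept, used = [], 0
    for msg in reversed(messages):          # walk newest -> oldest
        cost = len(msg["content"])
        if used + cost > max_chars:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))             # restore chronological order

history = [
    {"role": "user", "content": "What is a Honey Jar?"},
    {"role": "assistant", "content": "An encrypted knowledge container."},
    {"role": "user", "content": "Who can read it?"},
]
print(len(trim_history(history, max_chars=60)))  # → 2 (oldest turn dropped)
```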
Open Source, Apache 2.0 Licensed
No vendor lock-in. No surprise licensing changes. No "we got acquired and now everything costs money." Fork it, modify it, run it forever.
- Full source code on GitHub
- Apache 2.0 license with patent protection
- Community-driven development
- No feature gates or artificial limits
PII Protection That Works
Sensitive data, caught automatically. No manual redaction. No "oops, we trained on your SSNs." Sensitive values are detected and masked before the AI ever sees them.
- PII types detected automatically
- Real-time scrubbing before AI processing
- Passkey authentication for admin operations and sensitive access
- Visual indicators show what's being protected
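The general shape of scrub-before-send looks like this. A minimal sketch with two illustrative regex patterns — a production detector (STING's included) covers many more PII types and uses context, not just regexes:

```python
import re

# Illustrative patterns only; real detectors are far more thorough.
PII_PATTERNS = {
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scrub(text: str) -> str:
    """Replace detected PII with a labeled placeholder before any model call."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(scrub("Reach Ana at ana@example.com, SSN 123-45-6789."))
# → Reach Ana at [EMAIL], SSN [SSN].
```

The key property: the model only ever receives the placeholder, so there is nothing sensitive to memorize.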
Bee AI: Not Your Average Chatbot
We've all used chatbots that make you want to throw your laptop. Bee AI actually reads your docs, learns your lingo, and gives answers that make sense for YOUR business.
- Contextual understanding across conversations
- Learns from your documents via RAG
- Works with local LLMs (Ollama, LM Studio)
- Works at 3am when you're debugging production
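"Learns from your documents via RAG" means the question is matched against your document chunks and the best ones are handed to the model as context. A toy version of the retrieval step, with word counts standing in for a real embedding model and vector store:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def top_chunk(question: str, chunks: list[str]) -> str:
    """Retrieve the chunk most similar to the question."""
    q = Counter(question.lower().split())
    return max(chunks, key=lambda c: cosine(q, Counter(c.lower().split())))

chunks = [
    "Refunds are processed within 14 days of the return request.",
    "The VPN requires a hardware key for remote logins.",
]
print(top_chunk("how long do refunds take", chunks))
```

The retrieved chunk, plus the conversation so far, becomes the prompt — which is why the answers sound like your business instead of the model's training set.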
Multiple LLM Support
Use whatever model works for you. Run local with Ollama, connect to OpenAI, or use both depending on the task.
- Ollama for fully local operation
- OpenAI-compatible API support
- Anthropic support
- Model selection per conversation
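Because Ollama exposes an OpenAI-compatible endpoint, switching providers is mostly a matter of changing the URL and the `model` field. A hypothetical helper that builds the request either way (actually sending it needs a running Ollama server or an OpenAI API key, so this sketch stops at the payload):

```python
import json

ENDPOINTS = {
    "ollama": "http://localhost:11434/v1/chat/completions",  # local
    "openai": "https://api.openai.com/v1/chat/completions",  # hosted
}

def chat_request(provider: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Return (url, JSON body) for an OpenAI-compatible chat endpoint."""
    body = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return ENDPOINTS[provider], json.dumps(body).encode()

url, body = chat_request("ollama", "llama3", "Summarize our refund policy.")
print(url)
```

Per-conversation model selection is just this `model` field changing from one request to the next.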
WebAuthn/Passkey Auth
Modern authentication without the password headaches. Biometrics, hardware keys, whatever works for you.
- Passwordless login
- Touch ID, Face ID, Windows Hello
- Hardware key support (YubiKey, etc.)
- Powered by Ory Kratos
Flexible Deployment
Desktop app for individuals. Docker for teams. VM for IT departments that like things pre-packaged.
- Nectar desktop app (Mac, Windows, Linux)
- Docker Compose deployment
- Air-gap friendly—no internet required after install
Document Processing
Throw documents at it. PDFs, Word docs, Markdown, text files. STING extracts the content and makes it searchable.
- PDF extraction
- DOCX support
- Markdown files
- Plain text
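Under the hood this is a dispatch on file type: plain formats are read directly, while PDFs and DOCX files go through a parser library first. A hypothetical sketch of that dispatch (the parser calls are stubbed out — pypdf and python-docx are the usual suspects, but this is not STING's actual pipeline):

```python
from pathlib import Path

def extract_text(path: Path) -> str:
    """Read plain formats directly; binary formats need a parser library."""
    suffix = path.suffix.lower()
    if suffix in {".md", ".txt"}:
        return path.read_text(encoding="utf-8")
    if suffix in {".pdf", ".docx"}:
        raise NotImplementedError(f"{suffix} needs a parser, e.g. pypdf / python-docx")
    raise ValueError(f"unsupported file type: {suffix}")

sample = Path("notes.md")
sample.write_text("# Incident notes\nRestart the queue first.", encoding="utf-8")
print(extract_text(sample).splitlines()[0])  # → # Incident notes
```

Whatever the format, the output is the same: plain text, ready to be chunked and indexed into a Honey Jar.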
What It Looks Like

Bee AI in Action
Actual conversations with your documents.

Building Honey Jars
Packaging knowledge into searchable collections.

The Dashboard
What's happening at a glance.
Ready to Try It?
Download Nectar and you'll be chatting with your documents as soon as the containers finish pulling. Or deploy STING CE Server and have your whole team running in about 30 minutes.