Nectar - STING for Desktop


Private AI on Your Desktop

Download, install, start chatting with your documents. Everything runs locally on your machine.


Download Nectar Alpha

Alpha Release - Nectar is in active development. Expect bugs. Report issues on GitHub.

macOS

macOS 11 (Big Sur) or later
Intel & Apple Silicon

Download .dmg

Windows

Windows 10 or later
64-bit

Download .exe

Linux

Ubuntu 20.04+, Debian 11+
64-bit

Download .deb / .AppImage

Storage Note: The installer is small, but Nectar pulls Docker containers on first run (~1.5GB). With Ollama and AI models, expect 5-10GB total depending on which models you use.


What is Nectar?

Nectar is the desktop app for STING. It gives you a native application for managing your knowledge and chatting with your documents—all running locally on your machine.

Local AI

Runs with Ollama. Your documents and conversations never leave your machine. No cloud, no subscriptions, no data sharing.

Honey Jars

Organize your knowledge into encrypted, searchable collections. PDFs, Word docs, Markdown—throw it all in and search with AI.

Bee AI

Chat with your documents. Ask questions, get answers with citations. Bee AI learns your content and speaks your language.

Works Offline

Once installed, Nectar works without internet. Perfect for sensitive work, travel, or just avoiding distractions.

Connect to STING Server

Optionally connect to a STING Server to access shared team knowledge while keeping a native desktop experience.

Passkey Auth

Secure login with Touch ID, Face ID, Windows Hello, or hardware keys. No passwords to remember or leak.


Requirements

Component    Minimum        Recommended
RAM          8GB            16GB+
Storage      10GB free      20GB+ SSD
CPU          Dual-core      Quad-core+
GPU          Not required   Helps with larger models

Nectar runs Docker containers locally. First launch will download ~1.5GB of container images. AI models via Ollama add 2-8GB each depending on model size.
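
If you want to see how that budget maps onto your machine, two standard commands cover it (the second requires Docker to be installed and running):

df -h ~             # free space on the volume holding your home directory
docker system df    # disk used by Docker images, containers, and volumes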

Prerequisites

Docker - Nectar runs STING services in Docker containers. Install Docker Desktop (macOS and Windows) or Docker Engine (Linux) before launching Nectar.
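
To confirm Docker is installed and its daemon is running before you launch Nectar:

docker --version    # prints the installed Docker version
docker info         # fails with an error if the Docker daemon isn't running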

Ollama - For local AI models. Install after Docker:

macOS:

Download the Ollama app from ollama.com, or install with Homebrew:

brew install ollama

Linux:

curl -fsSL https://ollama.com/install.sh | sh

Ollama will download AI models as needed when you first use Nectar.
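
Once Ollama is installed, a quick sanity check that the CLI and its local server are working:

ollama --version    # confirms the CLI is on your PATH
ollama list         # lists downloaded models; empty output is fine on a fresh install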


Installation

macOS

  1. Download the .dmg file
  2. Open the downloaded file
  3. Drag Nectar to your Applications folder
  4. First launch: Right-click → Open (to bypass Gatekeeper for unsigned apps)
  5. Grant permissions when prompted (accessibility, files)

Note: Nectar is currently unsigned. macOS will warn you on first launch. This is normal for alpha software.

Windows

  1. Download the .exe installer
  2. Run the installer
  3. Windows SmartScreen may warn about an unrecognized app—click "More info" → "Run anyway"
  4. Follow the installation wizard
  5. Launch Nectar from the Start menu

Note: Nectar is currently unsigned. Windows will show a SmartScreen warning. This is normal for alpha software.

Linux

Debian/Ubuntu (.deb):

sudo dpkg -i Nectar.deb
sudo apt-get install -f  # Install dependencies if needed

AppImage:

chmod +x Nectar.AppImage
./Nectar.AppImage

Note: You may need to install additional dependencies for WebKit on some distributions.
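
On Debian/Ubuntu the missing piece is usually the WebKitGTK runtime. The package names below are examples and vary by release (and by how Nectar was built), so treat them as a starting point rather than the exact requirement:

sudo apt install libwebkit2gtk-4.1-0    # newer Debian/Ubuntu releases
sudo apt install libwebkit2gtk-4.0-37   # older releases ship the 4.0 runtime instead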


Quick Start

  1. Install Docker (see prerequisites above)
  2. Install Ollama for local AI models
  3. Download and install Nectar for your platform
  4. Launch Nectar - Docker containers will start automatically on first run
  5. Create a Honey Jar and add some documents
  6. Start chatting with Bee AI about your documents

First launch takes a few minutes while containers download. After that, startup is fast.
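
If that first launch seems to hang, you can check whether the containers actually came up; the exact container names will vary, but Nectar's STING services should be listed:

docker ps           # shows running containers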


FAQ

Where is my data stored?

Everything stays on your machine:

  • macOS: ~/Library/Application Support/Nectar
  • Windows: %APPDATA%\Nectar
  • Linux: ~/.local/share/nectar

No cloud sync, no telemetry, no data leaving your device.
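
Because everything lives in one directory, a backup is just an archive of it. The Linux path is shown below; substitute the macOS or Windows path from the list above, and quit Nectar first so nothing is written mid-copy:

tar czf nectar-backup-$(date +%F).tar.gz ~/.local/share/nectar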

Can I connect to a STING Server?

Coming soon. We're actively working on STING Server sync, which will let you:

  • Access shared team Honey Jars
  • Use server-side AI models
  • Keep your local work while accessing team knowledge

For now, Nectar works fully standalone. Check the roadmap for updates.

What AI models can I use?

Local models via Ollama (pull commands below):

  • Qwen 2.5 - Good all-around model
  • Phi-4 - Microsoft's efficient reasoning model
  • Llama 3.3 - Meta's latest
  • Any model Ollama supports
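
To download any of these ahead of time, the Ollama CLI uses short model tags. The tags below are the current names in the Ollama library and may change over time:

ollama pull qwen2.5     # Qwen 2.5
ollama pull phi4        # Phi-4
ollama pull llama3.3    # Llama 3.3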

Cloud providers (optional):

  • OpenAI - GPT-4o, GPT-4, etc.
  • Anthropic - Claude 3.5 Sonnet, Claude 3 Opus
  • Any OpenAI-compatible API endpoint

PII Protection: When using cloud providers, Nectar's PII layer automatically detects and scrambles sensitive data (SSNs, credit cards, phone numbers, etc.) before it leaves your machine. Responses are de-scrambled automatically. Your sensitive data never reaches the cloud.

Is Nectar free?

Yes. Nectar is free to use with no subscriptions, no usage limits, and no feature gates. You own your data completely.

Is Nectar open source?

The Nectar source code is maintained in a private repository, but the application itself is free to download and use.

Where do I download Nectar?

Download from GitHub Releases

How do I report a bug?

This is alpha software—bugs are expected!

Include your OS, Nectar version, and steps to reproduce.


Need the Server Version?

Nectar is for personal use on a single machine. If you need multi-user access or team features, check out STING Server.


Stay Updated

Nectar is in active development. Star or watch the repo on GitHub to follow new releases.

Star on GitHub