
Zero-trust security.
By architecture, not policy.

No data leaves your server unless you say so. Every layer of Dockbox is built so that security is structural -- not a checkbox you hope someone remembered to tick.

Single-tenant deployment
PII scrubbing before AI
Container-isolated sessions
Zero credential exposure

Your server. Nobody else's.

Most AI tools share infrastructure across customers. A vulnerability in one tenant can cascade to all. Dockbox is single-tenant by design -- every deployment gets its own compute, storage, and network.

Shared Cloud (Other AI Tools)

Multi-tenant SaaS

Customer A → ┌──────────────────┐
Customer B → │ Shared Server    │
Customer C → │ Shared Database  │
Customer D → └──────────────────┘
  • Data commingled with other orgs
  • Breach in one tenant risks all
  • No control over data residency
  • Vendor sees all your data
Dedicated Server (Dockbox)

Single-tenant deployment

Your Org   → ┌──────────────────┐
             │ Your Server      │
             │ Your Database    │
             │ Your Network     │
             └──────────────────┘
  • Complete physical isolation
  • Your firewall, your rules
  • Choose your data center region
  • No third party ever touches your data

Two-pass PII scrubbing pipeline

Before any document reaches a cloud AI model, it passes through a two-pass scrubbing pipeline that runs entirely on your server. Originals are preserved in a secure vault with role-based access.

📄

Document

Original file with sensitive data

🔎

Pattern Match

Regex + dictionary rules

🤖

AI Review

Local Ollama model catches context-dependent PII

🔒

Vault

Originals stored safely with restore access

☁️

Cloud AI

Only sees anonymized, scrubbed data

Pass 1: Regex + Dictionary

The first pass uses pattern matching to catch structured PII: social security numbers, credit card numbers, phone numbers, email addresses, and custom patterns you define. Dictionary-based matching catches names and known entities from your organization's data.
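The first pass could be sketched as a chain of regex substitutions followed by dictionary lookups. The patterns and labels below are illustrative, not Dockbox's actual rule set:

```javascript
// First scrubbing pass (sketch): structured PII via regex, known names via
// dictionary. Patterns and replacement labels here are illustrative only.
const PATTERNS = [
  { label: "SSN",   re: /\b\d{3}-\d{2}-\d{4}\b/g },
  { label: "PHONE", re: /\b\d{3}[-.]\d{3}[-.]\d{4}\b/g },
  { label: "EMAIL", re: /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g },
];

function scrubPass1(text, dictionary = []) {
  let out = text;
  for (const { label, re } of PATTERNS) {
    out = out.replace(re, `[${label}]`);      // structured PII → placeholder
  }
  for (const name of dictionary) {
    out = out.split(name).join("[NAME]");     // known entities from your org's data
  }
  return out;
}

// scrubPass1("Call Jane at 555-867-5309", ["Jane"])
//   → "Call [NAME] at [PHONE]"
```

Custom patterns you define would simply extend the `PATTERNS` list.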

Pass 2: Local AI Model

A local Ollama model (never leaves your server) performs contextual analysis to catch PII that regex misses -- indirect identifiers, contextual references, and sensitive information that only natural language understanding can detect.
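The second pass might be invoked as a request to Ollama's local `/api/generate` endpoint. The endpoint and port are Ollama's defaults; the model name and prompt wording below are assumptions:

```javascript
// Second scrubbing pass (sketch): build a request asking a local Ollama model
// to flag PII the regex pass missed. Model name and prompt are illustrative.
function buildPiiReviewRequest(text) {
  return {
    url: "http://localhost:11434/api/generate", // Ollama's default local port
    payload: {
      model: "llama3",                          // any locally pulled model
      stream: false,
      prompt:
        "List any personally identifiable information in the text below, " +
        "including indirect identifiers and contextual references. " +
        "Reply with one item per line.\n\n" + text,
    },
  };
}
```

Because the request targets `localhost`, the document text never leaves the server during this pass.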

Secure Vault

Original files are preserved at data/pii-vault/ with full audit trail. Authorized users can restore originals through the dashboard. The vault is never exposed to AI models or external services.

Role-Based Restore

Only administrators can access the vault and restore scrubbed files to their original state. Every restore action is logged. Group-level scrubbing ensures each team can only access their own vault entries.

Every conversation in a sealed container

Each AI session runs inside its own Docker container with isolated filesystem, memory, and network. When the session ends, the workspace is destroyed. Nothing persists that you did not explicitly save.

Container: engineering
  /workspace/group → groups/engineering/
  /workspace/ipc   → data/ipc/engineering/
  Memory: isolated · Network: restricted

Container: marketing
  /workspace/group → groups/marketing/
  /workspace/ipc   → data/ipc/marketing/
  Memory: isolated · Network: restricted

Container: finance
  /workspace/group → groups/finance/
  /workspace/ipc   → data/ipc/finance/
  Memory: isolated · Network: restricted

True process isolation

Dockbox does not use threads or sandboxed processes. Every AI session runs in a full Docker container -- the same isolation technology used by cloud providers to separate customers.

  • Isolated filesystem per group -- no cross-contamination
  • Separate memory space prevents data leaks between sessions
  • Configurable timeout kills runaway containers automatically
  • Concurrency queue prevents resource exhaustion
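The isolation above amounts to launching each session with its own Docker flags. A minimal sketch (flag values and the image name are illustrative defaults, not Dockbox's actual configuration):

```javascript
// Sketch: per-group session container launch arguments. Limits, image name,
// and mount paths are illustrative; a host-side timer would enforce the
// session timeout by killing the container.
function buildDockerArgs(group, memoryLimit = "512m") {
  return [
    "run", "--rm",                              // workspace destroyed on exit
    "--memory", memoryLimit,                    // separate, capped memory space
    "--network", "none",                        // restricted network access
    "-v", `groups/${group}:/workspace/group`,   // isolated filesystem per group
    "-v", `data/ipc/${group}:/workspace/ipc`,
    "dockbox-session",                          // hypothetical session image
  ];
}
```

These are the same `docker run` primitives cloud providers build on; nothing here is a custom sandbox.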

Containers never see real API keys

The credential proxy sits between containers and external APIs. Containers make requests with placeholder keys -- the proxy intercepts and injects real credentials. Even a fully compromised container cannot exfiltrate your secrets.

Request flow

Docker Container          ANTHROPIC_API_KEY=placeholder
        ↓
Credential Proxy :3001    strips placeholder, injects real key
        ↓
Anthropic API             authenticated request

The container environment contains only ANTHROPIC_BASE_URL=http://host:3001

Defense in depth

Even if an AI model were convinced to inspect its own environment variables, it would find nothing useful. The real credentials live only on the host, injected at the proxy layer.

  • API keys never enter container environment
  • Proxy runs on host network, inaccessible from container filesystem
  • Credential rotation requires zero container changes
  • Works with any AI provider -- swap keys without touching agents
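The injection step itself is small. A sketch of the proxy's header rewrite (the `x-api-key` header is Anthropic's; the placeholder convention is illustrative):

```javascript
// Sketch: the proxy's key-injection step. The real key is read from the host
// environment; containers only ever hold the placeholder string.
function injectRealKey(headers, realKey) {
  const out = { ...headers };
  if (out["x-api-key"] === "placeholder") {
    out["x-api-key"] = realKey;   // swapped in at the proxy, never in-container
  }
  return out;
}
```

Rotating credentials means changing `realKey` on the host -- no container is rebuilt or restarted.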

Mount security and access control

Containers see only allowlisted paths. Sensitive files are shadowed. Project code is read-only. A strict authorization matrix controls what each group can do.

# Container mount configuration
/workspace/group     → groups/{folder}         [read-write]
/workspace/global    → groups/global           [read-only]
/workspace/project   → PROJECT_ROOT            [read-only]
/workspace/ipc       → data/ipc/{group}/       [read-write]

# Sensitive files are never exposed
.env                 → /dev/null               [shadowed]
credentials.json     → blocked by allowlist    [denied]

# Additional mounts validated against allowlist
# Path traversal attempts (../) rejected at validation layer
# Sensitive directories (/etc, /root, ~/.ssh) always blocked
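The validation layer described in those comments could look like this. The prefixes and blocked paths below are illustrative, not Dockbox's actual allowlist:

```javascript
// Sketch: mount path validation. Allowlist prefixes and blocked directories
// are illustrative examples of the rules described above.
const ALLOWED_PREFIXES = ["groups/", "data/ipc/"];
const BLOCKED = ["/etc", "/root", ".ssh"];

function validateMountPath(path) {
  if (path.includes("..")) return false;                    // traversal rejected
  if (BLOCKED.some((b) => path.includes(b))) return false;  // sensitive dirs blocked
  return ALLOWED_PREFIXES.some((a) => path.startsWith(a));  // allowlist only
}
```

Anything not explicitly allowlisted is denied by default.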

Action                            Main Group       Non-Main Group
Send messages to own chat         Allowed          Allowed
Send messages to other chats      Allowed          Denied
Schedule tasks for self           Allowed          Allowed
Schedule tasks for other groups   Allowed          Denied
Register new groups               Allowed          Denied
View all groups                   Allowed          Denied
Scrub files                       Own group only   Own group only
Access project source             Read-only        No access
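The matrix above maps naturally onto a lookup table with deny-by-default semantics. A sketch (the action keys are illustrative names, not Dockbox identifiers):

```javascript
// Sketch: the authorization matrix as a lookup. Unknown actions are denied.
const PERMISSIONS = {
  "send:own-chat":        { main: true,  other: true  },
  "send:other-chats":     { main: true,  other: false },
  "schedule:self":        { main: true,  other: true  },
  "schedule:other-group": { main: true,  other: false },
  "groups:register":      { main: true,  other: false },
  "groups:view-all":      { main: true,  other: false },
};

function isAllowed(action, isMainGroup) {
  const entry = PERMISSIONS[action];
  if (!entry) return false;                    // deny anything unlisted
  return isMainGroup ? entry.main : entry.other;
}
```

Encoding the matrix as data rather than scattered checks keeps the policy auditable in one place.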

No vendor lock-in. Ever.

Self-hosted means you own everything. Your data, your infrastructure, your choice of AI provider. Walk away from any vendor at any time with zero data migration.

🔄

Swap AI Providers

Switch between Anthropic, OpenAI, or any compatible API by changing a single environment variable. The credential proxy handles the rest.

🏠

Run Local Models

Use Ollama or any local inference server for zero API cost. Your PII scrubber already runs locally -- extend that to all AI processing when needed.

💻

Full Source Access

Every line of code is available for audit, modification, and extension. No black boxes. No proprietary runtimes. Standard Node.js and Docker.

📦

Standard Formats

SQLite database, JSON configs, standard Docker containers. Export your data anytime in formats that any tool can read.

🌐

Your Data Center

Run on AWS, GCP, Azure, bare metal, or a Raspberry Pi. Dockbox runs anywhere Docker runs -- the choice is always yours.

🛠

Extend and Customize

Add channels, integrations, and AI tools through the MCP server interface. The plugin architecture means you never hit a wall.

Built for regulated industries

Dockbox's architecture aligns with major compliance frameworks out of the box. Single-tenant deployment and PII scrubbing give you the controls auditors need to see.

FERPA

Education

Student data never leaves your institution's server. PII scrubbing ensures that even AI-processed content contains no personally identifiable student records.

  • Student PII automatically scrubbed
  • Data remains on-premises
  • Audit trail for all AI interactions
  • Role-based access for faculty and staff
HIPAA

Healthcare

Protected health information stays within your controlled environment. Container isolation ensures PHI cannot leak between departments or sessions.

  • PHI scrubbed before cloud processing
  • Container isolation as access control
  • Encrypted storage on your infrastructure
  • No BAA required -- data never leaves
SOC 2

Enterprise

Dockbox's architecture implements SOC 2 trust service criteria patterns: logical access controls, system boundaries, and change management through code.

  • Defined system boundaries via containers
  • Logical access through mount allowlists
  • Infrastructure as code for change control
  • Comprehensive activity logging
Data Residency

Sovereignty

Deploy in any jurisdiction. Since you control the server, you control where data physically resides. Meet GDPR, PIPEDA, or any regional data residency requirement.

  • Choose your data center location
  • No cross-border data transfers
  • Local AI processing with Ollama
  • Full data deletion on your schedule

Security that doesn't slow you down

Deploy Dockbox on your infrastructure in under an hour. Zero-trust security comes standard -- no enterprise add-ons, no premium tiers.