# What is Local AI?
If you're comfortable with concepts like language models, quantization, and running models locally, feel free to skip ahead to Getting Started.
This guide introduces the fundamentals of running AI on your own hardware. By the end, you'll understand why local AI matters and how Docket makes it accessible.
## The Basics: What is a Language Model?
A language model (often called an LLM, or Large Language Model) is software trained on vast amounts of text to understand and generate human language. When you ask ChatGPT a question or have Claude write an email, you're using a language model.
These models work by repeatedly predicting the next token (a word or word fragment) in a sequence, but they've become sophisticated enough to:
- Answer questions and explain complex topics
- Write and debug code
- Summarize documents
- Translate languages
- Assist with creative writing
- Analyze images (with vision-capable models)
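Under the hood, generation is a loop of next-token prediction: the model assigns a probability to each candidate next token, one is chosen, and the process repeats. The sketch below is purely illustrative (the hand-written probability table stands in for a real neural network) but captures the loop itself:

```python
import random

# Toy stand-in for a language model: for each two-token context,
# a probability for each candidate next token. A real LLM computes
# these probabilities with a neural network over a huge vocabulary.
next_token_probs = {
    ("the", "cat"): {"sat": 0.6, "ran": 0.3, "meowed": 0.1},
    ("cat", "sat"): {"on": 0.8, "down": 0.2},
    ("sat", "on"): {"the": 0.9, "a": 0.1},
    ("on", "the"): {"mat": 0.7, "sofa": 0.3},
}

def generate(tokens, steps, seed=0):
    """Grow the text one token at a time by sampling from the table."""
    rng = random.Random(seed)
    for _ in range(steps):
        context = tuple(tokens[-2:])           # look at the last two tokens
        probs = next_token_probs.get(context)
        if probs is None:                      # context not in our toy table
            break
        choices, weights = zip(*probs.items())
        tokens.append(rng.choices(choices, weights=weights)[0])
    return tokens

print(" ".join(generate(["the", "cat"], steps=4)))
```

Everything an LLM "writes" comes out of this predict-append-repeat loop; scale and training data are what make the predictions useful.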
## Cloud vs. Local
Most AI services run in the cloud. Your prompts are sent to remote servers, processed, and sent back. This works well but has trade-offs:
| Cloud AI | Local AI |
|---|---|
| Requires internet connection | Works completely offline |
| Data sent to third-party servers | Data never leaves your device |
| Subscription costs | One-time hardware cost |
| Access to largest models | Limited by your hardware |
| Provider controls everything | You control everything |
Local AI means running the model directly on your computer (or in Docket's case, from a USB drive). Your prompts never leave your machine.
## Why Run AI Locally?
### Privacy and Security
When you use cloud AI, your conversations pass through third-party servers. For sensitive work like legal documents, medical information, proprietary code, or personal matters, this may not be acceptable.
With local AI:
- Your data stays on your device — nothing is transmitted
- No logging by providers — conversations aren't stored on remote servers
- Air-gapped operation — works in secure facilities with no network access
Docket takes this further with AES-256 encryption, protecting your conversations even if the drive is lost or stolen.
### Reliability and Access
Cloud services can go down, change their terms, or discontinue access. Local AI:
- Works without internet — essential for remote locations, travel, or emergencies
- No rate limits — use it as much as you want
- No censorship changes — the model behaves consistently
- Always available — no outages or maintenance windows
### Cost
Cloud AI typically requires ongoing subscriptions ($20+/month for premium access). Local AI has upfront costs (hardware) but no recurring fees. For heavy users, local AI often pays for itself quickly.
### Portability
With Docket, your entire AI workstation fits in your pocket. The USB drive contains everything you need: models, conversations, files, and settings. Plug it into any compatible computer and pick up right where you left off.
This means you can:
- Use any computer — your laptop, a friend's PC, a library computer
- Travel light — no need to set up AI on every machine you use
- Keep everything together — models, chats, and files in one place
## Applications of Local AI
Local AI excels in scenarios where privacy, offline access, or cost matter:
| Use Case | Why Local Works |
|---|---|
| Software Development | Keep proprietary code private while getting AI assistance |
| Legal & Medical | Handle sensitive client/patient information safely |
| Research | Analyze confidential data without exposure |
| Remote Work | Use AI where internet is unreliable or unavailable |
| Emergency Prep | Have AI assistance when infrastructure fails |
| Education | Learn and experiment without usage limits |
| Creative Writing | Write freely without content being logged |
Docket includes presets optimized for many of these scenarios, from coding to emergency survival.
## Limitations to Understand
Local AI has trade-offs worth knowing:
### Hardware Requirements
Models run on your CPU and RAM (or GPU if available). Larger, more capable models need more resources. A model that runs smoothly on a gaming PC might struggle on an older laptop.
Docket's pre-installed models are selected to run well on typical hardware (8GB RAM recommended), but you can also download larger models if your machine supports them.
See Machine Specs for guidance on what your hardware can handle.
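As a rough back-of-the-envelope sketch, you can estimate how much RAM a quantized model needs from its parameter count: each parameter occupies its quantized bit width divided by 8 bytes, plus some runtime overhead. The 4-bit default and 20% overhead factor below are illustrative assumptions, not Docket specifications:

```python
def estimate_ram_gb(params_billions, bits_per_param=4, overhead=1.2):
    """Rough RAM estimate for running a quantized model.

    Each parameter takes bits_per_param / 8 bytes; the overhead
    factor (assumed ~20%) covers the KV cache and runtime buffers.
    """
    bytes_total = params_billions * 1e9 * (bits_per_param / 8) * overhead
    return bytes_total / 1e9  # convert bytes to GB

for size in (7, 14, 70):
    print(f"{size}B model @ 4-bit: ~{estimate_ram_gb(size):.1f} GB RAM")
```

By this estimate, a 4-bit 7B model (~4.2 GB) fits comfortably in 8GB of RAM, a 14B model needs roughly 8.4 GB, and a 70B model calls for workstation-class memory.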
### Model Size vs. Capability
Cloud services run massive models like GPT-4 or Claude, with hundreds of billions of parameters, on specialized hardware. Local models are smaller so they can fit on consumer hardware.
This means local models may be:
- Less capable at complex reasoning
- More prone to errors on specialized topics
- Less fluent in languages other than English
That said, local models have improved dramatically. For most practical tasks like coding, writing, Q&A, and summarization, modern 7B-14B parameter models perform remarkably well.
### No Internet Features
Local models can't:
- Browse the web or access current information
- Connect to external APIs (unless you enable network access)
- Access real-time data
They only know what was in their training data. Docket lets you toggle network access when you need online features.
## How Docket Helps
Docket removes the technical barriers to local AI:
- Pre-configured — Models are already installed and optimized
- Portable — Runs from a USB drive on any compatible computer
- No installation — No Python, no dependencies, no command line
- Encrypted — Your data is protected automatically
- Model browser — Download additional models with a few clicks
Instead of spending hours setting up inference engines, downloading model files, and configuring parameters, you just plug in and start.
## Next Steps
Now that you understand what local AI is and why it matters:
- Models & Quantization — Learn how models work and which to choose
- Benefits & Trade-offs — Deeper dive into what local AI can and can't do