Pre-installed Models

Docket comes with 7 AI models ready to use out of the box.

Model Overview

Model                  Parameters   Best For                      RAM Required
Llama 3.2              1B           Quick tasks, low resources    4 GB
Gemma 3N               3B           General conversation          6 GB
Qwen 2.5 Uncensored    7B           Unrestricted responses        6 GB
Qwen 2.5 VL Vision     7B           Image understanding           8 GB
Qwen 2.5 Coder         7B           Programming                   6 GB
Gemma 2                9B           Balanced performance          8 GB
Qwen 2.5               14B          Complex reasoning             12+ GB
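
Not sure which of these your machine can handle? The sketch below compares your total system RAM against the figures in the table. It is only a rough guide: it checks total rather than free memory, the numbers above are minimums, and it assumes you have Python with the third-party psutil package installed.

```python
# Rough fit check for the pre-installed models, using the RAM figures
# from the table above. Requires psutil (pip install psutil).
# The figures are documented minimums, not guarantees.
import psutil

MODELS = [
    ("Llama 3.2 (1B)", 4),
    ("Gemma 3N (3B)", 6),
    ("Qwen 2.5 Uncensored (7B)", 6),
    ("Qwen 2.5 Coder (7B)", 6),
    ("Qwen 2.5 VL Vision (7B)", 8),
    ("Gemma 2 (9B)", 8),
    ("Qwen 2.5 (14B)", 12),
]

total_gb = psutil.virtual_memory().total / (1024 ** 3)
print(f"Detected RAM: {total_gb:.1f} GB")
for name, required_gb in MODELS:
    verdict = "should fit" if total_gb >= required_gb else "likely too little RAM"
    print(f"  {name:<26} needs {required_gb:>2}+ GB  ->  {verdict}")
```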

Model Details

Llama 3.2 (1B)

Best for: Quick responses and low-resource environments

  • Fastest response times
  • Lowest memory requirements
  • Good for simple questions and tasks
  • May struggle with complex reasoning

Gemma 3N

Best for: General purpose conversations

  • Balanced speed and quality
  • Good context understanding
  • Suitable for most everyday tasks
  • Moderate resource usage

Qwen 2.5 Uncensored

Best for: Unrestricted responses

  • Fewer content restrictions
  • More direct answers with fewer refusals
  • Good for creative writing
  • Same capabilities as standard Qwen 2.5

Qwen 2.5 VL Vision

Best for: Understanding images

  • Analyzes images you upload
  • Describes what's in photos
  • Reads text from images
  • Answers questions about visual content

Usage:

  1. Load the model
  2. Click the attachment icon in chat
  3. Upload an image
  4. Ask questions about the image
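
If you would rather script this workflow than use the chat window, the sketch below shows the general pattern for sending an image to a locally served vision model. It assumes an OpenAI-compatible chat endpoint; the URL and the qwen2.5-vl model identifier are placeholders for illustration, not something this page documents, so check your own installation for the actual values.

```python
# Minimal sketch of asking a vision model about an image over HTTP.
# ASSUMPTION: an OpenAI-compatible /v1/chat/completions endpoint is
# available locally; the URL and model name below are placeholders.
import base64
import requests

BASE_URL = "http://localhost:8080/v1"   # hypothetical endpoint
MODEL_NAME = "qwen2.5-vl"               # hypothetical model identifier

with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

payload = {
    "model": MODEL_NAME,
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this photo?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
}

response = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=120)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```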

Qwen 2.5 Coder

Best for: Programming assistance

  • Optimized for code generation
  • Understands many programming languages
  • Good at debugging and explaining code
  • Can write tests and documentation

Supported languages:

  • Python, JavaScript, TypeScript
  • Rust, Go, C/C++
  • HTML, CSS, SQL
  • And many more
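
For programmatic use, the same assumed OpenAI-compatible endpoint pattern works with the coder model, shown here through the official openai Python client pointed at a local server. The base URL and model name are again placeholders for illustration, not values documented on this page.

```python
# Minimal sketch of asking the coder model to debug a snippet.
# ASSUMPTION: an OpenAI-compatible local endpoint; the base_url and
# model name are placeholders -- substitute your installation's values.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1",  # hypothetical endpoint
                api_key="not-needed-for-local")

buggy_snippet = '''
def average(numbers):
    return sum(numbers) / len(numbers)  # fails on an empty list
'''

completion = client.chat.completions.create(
    model="qwen2.5-coder",  # hypothetical model identifier
    messages=[{
        "role": "user",
        "content": "Explain the bug in this function and rewrite it safely:\n" + buggy_snippet,
    }],
)
print(completion.choices[0].message.content)
```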

Gemma 2

Best for: Balanced performance

  • High quality responses
  • Good reasoning capabilities
  • Versatile for different tasks
  • Moderate memory requirements

Qwen 2.5 (14B)

Best for: Complex reasoning tasks

  • Largest pre-installed model
  • Best quality responses
  • Handles complex instructions
  • Requires more RAM and time

Tip: Start with smaller models and move up if you need more capability. Larger models don't always produce better results for simple tasks.

Loading Models

  1. Open the Models section in the sidebar
  2. Click on a model to select it
  3. Click "Load" to load it into memory
  4. Wait for the status to show "Ready"

Unloading Models

Models stay in memory until you:

  • Load a different model
  • Close the application
  • Manually unload via the Models page

Only one local model can be loaded at a time.