# Shepherd: Advanced Ollama Manager

Shepherd Logo

A premium, self-hosted web interface for managing your Ollama models and server.


## 🚀 Overview

Shepherd is a powerful and elegant web application designed to be the ultimate companion for your Ollama server. Built with a focus on usability and aesthetics, it provides a comprehensive suite of tools to manage your local LLM infrastructure. Whether you are a developer testing new models, a researcher organizing a library, or an enthusiast exploring local AI, Shepherd gives you full control through a sleek, dark-mode interface.

## ✨ Key Features

### 📊 Interactive Dashboard

- **Real-time Monitoring:** Visualize server status and key metrics at a glance.
- **Library Overview:** Track your total installed models and their aggregate size.
- **Quick Actions:** Access your most frequently used tools directly from the home screen.

Dashboard

> **Tip:** View Full Screenshot Gallery - Take a visual tour of all features including Settings, Model Factory, and more.

### 📦 Comprehensive Model Management

- **One-Click Installation:** Search and pull models directly from the official Ollama library.
- **Detailed Metadata:** Inspect technical details including parameter size (7B, 70B), quantization level (Q4_K_M, FP16), and model family.
- **Smart Filters:** Easily sort and filter your library to find the right model for the job.
- **Deep Inspection:** View the raw Modelfile, prompt templates, system parameters, and license information with syntax highlighting.
- **Copy & Backup:** Duplicate models with a single click to create experimental branches or backups (`my-model-v1` -> `my-model-v2`).

Installed Models
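Shepherd drives a standard Ollama server, so the Copy & Backup action corresponds to Ollama's documented `/api/copy` endpoint. A minimal sketch of the request involved (the server URL is the Ollama default, and this is an illustration, not Shepherd's actual code):

```python
import json

# Ollama's default local address (assumption; yours may differ).
OLLAMA_URL = "http://localhost:11434"

def copy_model_request(source: str, destination: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for duplicating a model,
    e.g. my-model-v1 -> my-model-v2, via Ollama's /api/copy endpoint."""
    body = json.dumps({"source": source, "destination": destination}).encode()
    return f"{OLLAMA_URL}/api/copy", body

url, body = copy_model_request("my-model-v1", "my-model-v2")
print(url)
# POST `body` to `url` (with urllib or requests) against a running Ollama server.
```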

### 🧪 Advanced Model Factory

- **Custom Model Creation:** Build new models on top of existing ones without touching a terminal.
- **Prompt Engineering:** Define custom system prompts and prompt templates.
- **Fine-Grained Control:** Adjust inference parameters such as:
  - **Temperature:** Control creativity vs. determinism.
  - **Context Window:** Set custom context lengths (e.g., 8192, 32k).
  - **Stop Sequences:** Define custom stop tokens.
  - **Seeds & Sampling:** Set specific seeds for reproducibility.

Model Factory
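These Factory settings map onto standard Ollama Modelfile directives. A model built with the options above is roughly equivalent to a Modelfile like this (the base model, prompt, and parameter values are illustrative, not Shepherd output):

```
# Illustrative Modelfile -- names and values are examples only
FROM llama3:8b
SYSTEM "You are a concise technical assistant."
PARAMETER temperature 0.7
PARAMETER num_ctx 8192
PARAMETER stop "<|endoftext|>"
PARAMETER seed 42
```

The same file could be built from the CLI with `ollama create my-model -f Modelfile`; the Factory simply does this through the web UI.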

### 🧬 Embeddings & Utilities

- **Embeddings Factory:** Create specialized embedding models tailored to your RAG pipelines.
- **File-to-Vector:** Upload text files (`.txt`, `.md`, `.json`) and instantly generate vector embeddings. The result is a downloadable JSON file ready for your vector database.
- **Integrated Chat Playground:** Test any installed model instantly in a streaming chat interface to verify behavior and prompt effectiveness.

Embeddings Factory
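Conceptually, the File-to-Vector flow splits an uploaded file into chunks and requests one embedding per chunk from Ollama's `/api/embeddings` endpoint. A sketch of that flow (the chunk size, model name, and export schema are illustrative assumptions, not Shepherd's exact format):

```python
import json

def chunk_text(text: str, size: int = 500) -> list[str]:
    """Split text into fixed-size character chunks (a simple stand-in
    for whatever chunking strategy the app actually uses)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embedding_requests(text: str, model: str = "nomic-embed-text") -> list[dict]:
    """Build one Ollama /api/embeddings request body per chunk."""
    return [{"model": model, "prompt": chunk} for chunk in chunk_text(text)]

reqs = embedding_requests("some document text " * 100)
# POST each body to http://localhost:11434/api/embeddings; collect the returned
# "embedding" vectors into a JSON file for your vector database.
export = {"model": "nomic-embed-text", "chunks": len(reqs)}
print(json.dumps(export))
```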

### ⚙️ Enterprise-Ready

- **Multi-Server Support:** Manage multiple Ollama instances (local, remote, or cloud-hosted) from a single interface.
- **Responsive Design:** Fully responsive UI that works on desktop, tablet, and mobile.

## 🛠️ Installation & Setup

### Prerequisites

1. **Ollama:** You must have an Ollama server running.
2. **Python 3.10+:** Ensure you have a compatible Python version installed.

### Quick Start (Manual)

1. **Clone the Repository**

   ```bash
   # Clone to your preferred location (e.g., /opt/shepherd or ~/shepherd)
   git clone https://git.mamber.in/Personal/Shepherd.git /opt/shepherd
   cd /opt/shepherd
   ```

2. **Create a Virtual Environment**

   ```bash
   python3 -m venv venv
   source venv/bin/activate
   ```

3. **Install Dependencies**

   ```bash
   pip install -r requirements.txt
   ```

4. **Launch the Application**

   ```bash
   uvicorn backend.main:app --host 0.0.0.0 --port 8000
   ```

5. **Shepherd is Live!** Open your browser and navigate to: http://localhost:8000


### Running as a Systemd Service (Recommended for Servers)

For production or always-on deployments, run Shepherd as a systemd service.

**1. Create a Dedicated User (Optional but Recommended)**

```bash
sudo useradd -r -s /bin/false shepherd
sudo chown -R shepherd:shepherd /opt/shepherd
```

**2. Configure the Service File**

A sample service file is provided at `shepherd.service`. You must edit it to match your setup:

| Setting | Default Value | What to Change |
| --- | --- | --- |
| `User` | `shepherd` | The Linux user that will run the service |
| `Group` | `shepherd` | The group for the service user |
| `WorkingDirectory` | `/opt/shepherd` | The path where you cloned the repository |
| `ExecStart` | `/opt/shepherd/venv/bin/uvicorn ...` | Update the path if you cloned elsewhere |

**Example:** If you cloned to `/home/myuser/apps/shepherd` and want to run as `myuser`:

```ini
User=myuser
Group=myuser
WorkingDirectory=/home/myuser/apps/shepherd
ExecStart=/home/myuser/apps/shepherd/venv/bin/uvicorn backend.main:app --host 0.0.0.0 --port 8000
```
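For reference, a complete unit file following the default settings might look like this (a sketch only; the `shepherd.service` file shipped with the repository is authoritative):

```ini
[Unit]
Description=Shepherd - Ollama Manager
After=network.target

[Service]
User=shepherd
Group=shepherd
WorkingDirectory=/opt/shepherd
ExecStart=/opt/shepherd/venv/bin/uvicorn backend.main:app --host 0.0.0.0 --port 8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
```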

**3. Install and Enable the Service**

```bash
# Copy the service file to systemd
sudo cp shepherd.service /etc/systemd/system/

# Reload systemd to recognize the new service
sudo systemctl daemon-reload

# Enable the service to start on boot
sudo systemctl enable shepherd

# Start the service now
sudo systemctl start shepherd
```

**4. Verify It's Running**

```bash
# Check the service status
sudo systemctl status shepherd

# View live logs
sudo journalctl -u shepherd -f
```

**5. Managing the Service**

```bash
sudo systemctl stop shepherd      # Stop the service
sudo systemctl restart shepherd   # Restart after config changes
sudo systemctl disable shepherd   # Prevent auto-start on boot
```

## 📚 Ollama Resources

Shepherd is built to empower your experience with Ollama. Here are essential resources for the ecosystem:

| Resource | Description |
| --- | --- |
| Ollama Website | The official home of the project. Download the runner here. |
| Model Library | Browse thousands of community models (Llama 3, Mistral, Gemma, Phi-3). |
| Documentation | Deep dive into installation, API, and customization. |
| Modelfile Guide | Learn how to craft custom Modelfiles for your specific needs. |
| API Reference | Technical details for the REST API that powers Shepherd. |

## 🤝 Contributing

We welcome contributions from the community! Whether it's adding a new feature, fixing a bug, or improving documentation, your help is appreciated.

1. Fork the repository.
2. Create your feature branch (`git checkout -b feature/AmazingFeature`).
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`).
4. Push to the branch (`git push origin feature/AmazingFeature`).
5. Open a Pull Request.

## 📄 License

Shepherd is open-source software licensed under the GNU General Public License v3.0.
