ChatGPT, Claude, Gemini – we all use AI assistants in our daily lives now. But if you work with sensitive data or simply value your privacy, you might ask yourself: Do I really need to send my texts, code snippets, and documents to the cloud?
The answer: No. With OllamaDeploy, I've developed an open-source tool that automates the installation of a complete AI system on the Mac – in less than 10 minutes.
What is OllamaDeploy?
OllamaDeploy is an installation script that brings together three components:
- Ollama – the runtime that downloads and runs local Large Language Models (LLMs)
- Docker Desktop – for container isolation
- Open WebUI – a ChatGPT-like web interface
After installation, a complete AI system runs locally on your Mac. No internet connection required, no data leaves your machine.
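OllamaDeploy wires these three pieces together for you, but it helps to know roughly what ends up running: Ollama serves its API locally on port 11434, and Open WebUI runs as a Docker container that talks to it. Conceptually, the container is started along the lines of the standard Open WebUI Docker command below – the exact flags and names OllamaDeploy uses may differ:

# Open WebUI as a container on port 3000, pointed at the local Ollama API
# (standard Open WebUI Docker command; OllamaDeploy may use other flags/names)
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main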
Why Local?
The advantages of a local AI installation are numerous:
- Privacy – Confidential documents, code, and personal data stay on your own device
- No subscription costs – Once installed, there are no monthly fees
- Offline operation – Perfect for travel or environments without internet access
- Flexibility – Multiple AI models can be used in parallel, depending on the task
Which Models are Available?
OllamaDeploy supports every model in the Ollama catalog. A few popular choices:
- Llama 3.1/3.3 (Meta) – The all-rounder for most tasks
- DeepSeek-R1 – A reasoning model, particularly strong at coding and math
- Gemma 3 (Google) – Compact and fast
- Mistral – Good balance of speed and quality
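The names above correspond to tags in the Ollama library, so once the installation is done you can pull them from the terminal as well as from the web interface. A quick example (tags as listed in the Ollama catalog at the time of writing):

# Pull models from the Ollama catalog
ollama pull llama3.1
ollama pull deepseek-r1
ollama pull gemma3
ollama pull mistral

# Show everything that is installed locally
ollama list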
System Requirements
OllamaDeploy is optimized for Macs with Apple Silicon:
- Mac with M1, M2, M3, or M4 processor
- At least 16 GB RAM (32 GB recommended)
- 50 GB free storage space
- macOS 11 Big Sur or newer
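If you're not sure whether your Mac ticks these boxes, the following standard macOS commands (not part of OllamaDeploy) give a quick answer:

# Chip – should report an Apple M1/M2/M3/M4
sysctl -n machdep.cpu.brand_string

# Installed RAM in GB
echo "$(($(sysctl -n hw.memsize) / 1024 / 1024 / 1024)) GB"

# Free disk space on the system volume
df -h /

# macOS version
sw_vers -productVersion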
Installation in 3 Steps
The installation is intentionally simple:
# 1. Download and extract the ZIP
curl -LO https://juergenkoller.software/downloads/OllamaDeploy.zip
unzip OllamaDeploy.zip
cd OllamaDeploy
# 2. Start the interactive menu
./start.sh
# 3. Select "Fully Automatic Installation"
After about 10 minutes, the web interface opens at http://localhost:3000.
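If the browser window doesn't open by itself, you can check from the terminal whether everything came up. Port 11434 is Ollama's default API port; the container name below is an assumption and may differ in your OllamaDeploy setup:

# Ollama's API should answer with a JSON list of installed models
curl http://localhost:11434/api/tags

# The Open WebUI container should be listed as running
# (container name assumed – OllamaDeploy may use a different one)
docker ps --filter "name=open-webui"

# Open the web interface manually
open http://localhost:3000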
Who is OllamaDeploy For?
The tool is aimed at:
- Developers – Code reviews, debugging, documentation
- Content Creators – Writing texts, brainstorming, translations
- Businesses – Processing sensitive data locally
- Private users – Using AI without data sharing
Download
OllamaDeploy is available as a free download. The project builds on proven open-source components (Ollama, Open WebUI) and is free to use.
Download OllamaDeploy (ZIP, 46 KB)
Based on: heise.de – Mac als lokales KI-System: So geht's ("Mac as a local AI system: here's how")
Conclusion
If you want to use AI without sending your data to cloud services, OllamaDeploy offers a straightforward solution. The installation is automated, the web interface is intuitive, and the model selection covers most use cases.
Give it a try and let me know your feedback – via the contact form or email.