# Deploy on Mac

## Security
This script exposes many ports and services on your system. It is recommended that you put a firewall in front of your server so that only your IP address can access it, or that you run it on a private network.
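As an illustrative sketch (not part of the install script), the built-in `pf` firewall on macOS can restrict the proxy ports to a single address. The IP address below is a placeholder for your own:

```bash
# Example additions to /etc/pf.conf (203.0.113.10 is a placeholder for your IP)
# Block all inbound traffic to the proxy ports...
block in proto tcp from any to any port {80, 443}
# ...then allow only your own address (last matching rule wins in pf)
pass in proto tcp from 203.0.113.10 to any port {80, 443}
```

Reload the rules with `sudo pfctl -f /etc/pf.conf` and enable pf with `sudo pfctl -e`.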
## Prerequisites
- macOS 12.0.1 or newer
- Docker Desktop for Mac
- Apple silicon processor (M1 or better)
- Mac integrated GPU
- At least 32 GB of RAM
- At least 100 GB of free disk space
- Ports 80 and 443 open on your machine (if you want to use the proxy)
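To confirm that ports 80 and 443 are free before installing, you can check for existing listeners. This is only a quick sanity check, not something the script requires:

```bash
# Each command should print nothing if the port is free
sudo lsof -nP -iTCP:80 -sTCP:LISTEN
sudo lsof -nP -iTCP:443 -sTCP:LISTEN
```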
## Install Docker and Docker Compose
Install Docker Desktop for Mac: https://docs.docker.com/desktop/install/mac-install/
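If you already use Homebrew, installing Docker Desktop from its cask is an alternative to the manual download (this assumes Homebrew is installed):

```bash
# Install Docker Desktop via Homebrew, then launch it once to finish setup
brew install --cask docker
open -a Docker
```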
## Using Your Own TLS Certificates
If you intend to install with a domain name and use the proxy, there are two options.
### Option 1 - Automatic Self-Signed
Let the script generate a self-signed certificate for you. This is the easiest option and is recommended for most users.
### Option 2 - User Provided
Generate your own valid TLS certificates and use them with the proxy.
You will need to create a file called `cert-bundle.pem` with the contents of `private.key` at the top, followed by the contents of `fullchain.pem`, all in a single file. Make sure all headers and footers are included, such as `-----BEGIN-----` and `-----END-----`.

Once you have created the `cert-bundle.pem` file, place it in the `./deploy/llamanator-bash/services/llamanator/haproxy/user-provided-certs` directory.
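For example, assuming your key and certificate chain are named `private.key` and `fullchain.pem` and you are in the repo root, the bundle can be built and placed like this:

```bash
# Concatenate the private key (first) and the full chain (second) into one bundle
cat private.key fullchain.pem > cert-bundle.pem

# Copy the bundle into the directory the proxy reads user-provided certs from
cp cert-bundle.pem ./deploy/llamanator-bash/services/llamanator/haproxy/user-provided-certs/
```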
Once that is done, run the install script with the `--install-proxy` option as shown below.
Check out the Generating Certs page for more information on how to generate your own certificates.
## Install Llamanator Bash on macOS
- Install Ollama on your Mac: https://ollama.com
- Set Ollama to be exposed on your machine by running `launchctl setenv OLLAMA_HOST "0.0.0.0"` and then restarting Ollama (see the verification sketch after this list)
- Clone this repo (if you haven't already): `git clone https://github.com/llamanator-project/llamanator.git`
- Change directory to the Llamanator Bash directory: `cd llamanator/deploy/llamanator-bash`
- Copy the `.env.example` file to `.env`: `cp .env.example .env`
- Edit the `.env` file to enable the services you want to run. Please review the `.env` instructions below for the required options.
- Run the install script:
  - To install with the proxy on ports 80/443, run: `sudo ./install.sh --install-proxy`
  - To install without the proxy, run: `sudo ./install.sh`
- Once complete, open the `./llamanator-links.txt` file to access your services
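Before running the installer, it can help to confirm that the Ollama exposure step took effect. This is only a sketch, assuming Ollama's default port of 11434:

```bash
# Confirm the environment override that Ollama will see after it restarts
launchctl getenv OLLAMA_HOST    # should print 0.0.0.0

# After restarting Ollama, its API should respond (this lists locally pulled models)
curl http://localhost:11434/api/tags
```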
NOTE ABOUT MACOS: In some services, `127.0.0.1` and `localhost` may not work. You may need to use the IP address of your machine to access the services.
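If you are not sure what your machine's IP address is, macOS can report it directly (this assumes `en0` is your active interface; yours may differ):

```bash
# Print the IPv4 address of the primary network interface
ipconfig getifaddr en0
```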
## Uninstall Llamanator Bash
There are two options to remove the Llamanator project from your machine:

- Just stop the services and keep the data: `sudo ./uninstall.sh`
- Stop the services and remove all data: `sudo ./uninstall.sh --remove-all-data`
## Restarting Llamanator Bash
Regardless of the uninstall option you choose, you can restart the Llamanator project by running the `./install.sh` script again. If the volumes are still present, the data will be re-attached to the services. If you removed the data, the services will start fresh.
## Note about Ollama
### Linux
The Ollama data is stored locally on disk so you don't have to download the LLMs again. The data is stored on the host filesystem in the `ollama_data` directory. If you want to remove the Ollama data, you can delete the `./services/ollama/ollama_data` directory.
### macOS
The Ollama data is stored in the `~/.ollama/models` directory. If you want to remove all Ollama data, you can delete the `~/.ollama/models` directory.

If you only want to remove specific LLMs, you can run `ollama rm <model_name>` to remove the LLM from the Ollama service.
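For instance, to see how much space the models use and remove one by name (the model name below is just an example):

```bash
# Show how much disk the downloaded models use
du -sh ~/.ollama/models

# List the models Ollama has pulled, then remove one by name
ollama list
ollama rm llama3
```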
## Cleaning Up Docker
If you want to clean up Docker, you can run the `docker system prune -a` command. This will remove all stopped containers, all networks not used by at least one container, all unused images (not just dangling ones), and all build cache.
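A sketch of a fuller cleanup; note that adding `--volumes` also deletes unused Docker volumes, which can include Llamanator service data you may want to keep:

```bash
# Remove stopped containers, unused networks and images, and build cache
docker system prune -a

# More aggressive: also prune unused volumes (this deletes volume data!)
docker system prune -a --volumes
```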