Set up and install Ollama with Open WebUI on Ubuntu 24.04 and Docker. A self-hosted LLM AI platform in the cloud on Azure, AWS or Google GCP. Integrate with OpenAI-compatible APIs. The ultimate ChatGPT-style user interface.
Self Host Ollama and OpenWebUI
Setup Ollama + OpenWebUI on Azure
Deploy Ollama + OpenWebUI on Ubuntu 24.04 in Azure
Setup Ollama + OpenWebUI on AWS
Deploy Ollama + Open WebUI on Ubuntu 24.04 in AWS
Setup Ollama + OpenWebUI on GCP
Deploy Ollama & OpenWebUI on Ubuntu 24.04 in GCP
Getting Started with Ollama and Open WebUI
Once your Ollama server has been deployed, the following links explain how to connect to a Linux VM:
Once connected and logged in, the following section explains how to start using Ollama with Open WebUI.
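As a quick reference, connecting over SSH from your local terminal typically looks like the command below. The username and IP address are placeholders; use the admin credentials and public IP address from your own deployment:
ssh your-username@your-vm-public-ip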
Ollama runs locally on port 11434 and Open WebUI runs as a local Docker container and uses port 8080.
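To quickly verify that Ollama is up before continuing, you can query it directly from the VM (assuming curl is installed); it should reply with "Ollama is running":
curl http://127.0.0.1:11434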
Installing Ollama LLM Models
Once logged in via SSH, the first step is to decide which Ollama LLMs (large language models) you would like to install. For example, to install Meta Llama 3, described as the most capable openly available LLM to date, run the following command:
ollama run llama3
To exit the LLM prompt and return to your terminal, press Ctrl + D.
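If you'd rather download a model without opening the interactive chat prompt, you can pull it first and run it later:
ollama pull llama3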
To list your installed LLMs run the following command:
ollama list
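The output should look similar to the following (the ID, size and timestamp shown here are purely illustrative):
NAME             ID              SIZE      MODIFIED
llama3:latest    365c0bd3c000    4.7 GB    2 minutes ago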
Login to Open WebUI Interface
Open WebUI runs as a Docker container. To check the status of the Open WebUI container, run the following command:
sudo docker ps
It may take a few minutes for the container to fully start up.
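If the container is listed but the web interface isn't responding yet, you can follow its logs to watch the startup progress. The container name open-webui below is an assumption based on the common Open WebUI setup; use whatever name appears in your docker ps output:
sudo docker logs -f open-webui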
Once you’ve installed your LLMs, you can use Open WebUI as your ChatGPT-style interface. Log in at the following URL:
http://youripaddress:8080
On the login screen, you first need to create a username and password by selecting Sign up.
Once logged in, you can select your installed model from the drop-down menu and you’re ready to start using the chat window. Response speed will depend on the size of the VM you selected during deployment; you can upgrade the VM size if you want to improve performance.
Ollama and Open WebUI Firewall Ports
Ollama runs locally on the following port:
TCP 11434 (It runs on http://127.0.0.1:11434)
Open WebUI runs on a local Docker container on the following port:
TCP 8080
The links below explain how to modify or create firewall rules depending on which cloud platform you are using.
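If your Ubuntu VM also runs a host-level firewall such as ufw, you may additionally need to open the Open WebUI port locally. This is a hedged example; skip it if ufw is not enabled on your VM:
sudo ufw allow 8080/tcp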
Disclaimer: Ollama is licensed under the MIT license. No warranty of any kind, express or implied, is included with this software. Use it at your own risk; responsibility for any damages resulting from the use of this software rests entirely with the user. The author is not responsible for any damage its use may cause.
Cloud Solution Architect. Helping customers transform their businesses to the cloud. 20 years' experience working in complex infrastructure environments and a Microsoft Certified Solutions Expert on everything cloud.