BELULLAMA

A powerful stand-alone AI application bundle!

Run Llama 3, Mistral, Gemma, and other large language models locally. Generate images with Stable Diffusion.

Install with one command:

curl -s https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer.sh | sudo bash
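Once the installer finishes, a quick way to confirm the bundled Ollama service is running is to query its HTTP API. This sketch assumes the stack maps Ollama to its default port 11434 on localhost; adjust the host and port if your deployment differs.

# Should return the Ollama version if the service is up
curl -s http://localhost:11434/api/version

# List the models available locally (empty on a fresh install; pull models via Open WebUI or `ollama pull`)
curl -s http://localhost:11434/api/tags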

🧪 Beta Testers Needed 🧪

As we're in the final stages of development, we're looking for beta testers to help ensure the NVIDIA version works flawlessly across different setups. If you have an NVIDIA GPU and would like to contribute as a beta tester, please try the GPU-supported version:

Install the GPU version with one command:

curl -s https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer_gpu.sh | sudo bash
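Before running the GPU installer, it helps to confirm that the NVIDIA driver and the NVIDIA Container Toolkit are working, since the GPU-enabled containers depend on both. The CUDA image tag below is only an example; any recent nvidia/cuda base image works for this check.

# Driver check on the host
nvidia-smi

# Container Toolkit check: the GPU should be visible from inside a container
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi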

Features

  • All-in-One AI Platform: Integrates Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI)
  • Easy Setup: Simple installer script for quick deployment
  • Conversational AI: Create and manage chatbots and AI applications
  • Image Generation: Use Stable Diffusion models through Automatic1111 WebUI
  • User-Friendly Interface: Intuitive Open WebUI for interacting with language models
  • Offline Operation: Run entirely offline for data privacy and security
  • Extensibility: Customize and extend functionality as needed (see the API example below)
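Because the bundle exposes the standard Ollama HTTP API, you can script against it directly, which is one simple way to extend it. The command below is a minimal sketch: it assumes the default port mapping (11434) and that a model such as llama3 has already been pulled; the model name and port are assumptions, not fixed by Belullama.

# Ask the locally hosted model a question from the shell (non-streaming response)
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Explain what a large language model is in two sentences.", "stream": false}'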
