
AI

Description/Reasoning

Runs my Ollama and Open WebUI containers. Ollama is the easiest way to run LLMs that I've come across.
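
The page doesn't record the exact container configuration, so here is a minimal Docker Compose sketch of how Ollama and Open WebUI are commonly wired together. The image tags, container names, published ports, and volume names below are assumptions for illustration, not necessarily the settings used on this VM.

```yaml
services:
  ollama:
    image: ollama/ollama                # official Ollama image
    container_name: ollama
    ports:
      - "11434:11434"                   # Ollama API
    volumes:
      - ollama:/root/.ollama            # persist downloaded models
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"                     # web UI on http://<host>:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # point the UI at the Ollama service
    volumes:
      - open-webui:/app/backend/data    # persist UI settings and chats
    depends_on:
      - ollama
    restart: unless-stopped

volumes:
  ollama:
  open-webui:
```

With something like this, `docker compose up -d` brings both containers up, the UI is reachable on port 3000, and models can be pulled either from the Open WebUI interface or from inside the Ollama container, e.g. `docker exec -it ollama ollama pull llama3`.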

Hardware

  • Custom Built
  • SBC (Raspberry Pi, ZimaBoard, etc.)
  • Proxmox Node

Type

  • Physical
  • Virtual

If virtual, which Proxmox Node

  • Node 1
  • Node 2

Processor

  • Cores: 12 (host CPU type)

Memory

  • Capacity: 32GB

Storage

  • Main Drive:
    • Type: VirtIO SCSI
    • Capacity: 160GB

Operating System

  • Type: Ubuntu Server
  • Version: 24.04

Notes & Tips