AI
Matt Reeves edited this page 2024-05-29 04:17:40 +00:00
Description/Reasoning
Runs my Ollama and Open-webui containers. Ollama is the easiest way to run LLMs that I've come across.
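A minimal compose sketch of how the two containers could be wired together; the image tags, port mappings, and volume names here are assumptions for illustration, not taken from this page:

```yaml
# Sketch only: Open WebUI talking to Ollama over the compose network.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # model storage
    ports:
      - "11434:11434"          # Ollama API
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Point the UI at the ollama service by its compose DNS name
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"            # host port is an assumption
    depends_on:
      - ollama
    restart: unless-stopped

volumes:
  ollama:
```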
Hardware
- Custom Built
- SBC (Raspberry PI, ZimaBoard, etc.)
- Proxmox Node
Type
- Physical
- Virtual
If virtual, which Proxmox Node
- Node 1
- Node 2
Processor
- Cores: 12 (CPU type: host)
Memory
- Capacity: 32GB
Storage
- Main Drive:
- Type: VirtIO SCSI
- Capacity: 160GB
Operating System
- Type: Ubuntu Server
- Version: 24.04