Linux Out of the Box – Running Local AI

· 27m 13s · 979 views

In this Linux Out of the Box episode, I shift focus from gaming and creative tools to something bigger: AI on the desktop. Using Project Bluefin, I explore how to set up and run local large language models (LLMs) alongside popular cloud AI platforms — all without endless subscriptions.

Test System Specs:
• AMD Ryzen 9 processor
• NVIDIA RTX 5060 Ti GPU
• 32GB RAM
• 1TB SSD

🔹 What’s Covered in This Video
• AI Hype vs Reality
   – Why AI is more like the next industrial revolution than job-killer hype
   – The “gold rush” around AI apps vs the real winners: hardware makers like NVIDIA
   – Why subscriptions (ChatGPT, Claude, etc.) resemble the new Netflix model
• Cloud AI Platforms
   – Quick demos with Claude and ChatGPT-5
   – Fun tests: generating pirate & potato stories, recipes, and coding snippets
   – Pros & cons of relying on paid cloud AI
• Running Local AI on Linux
   – Installed the Ollama backend with one command on Bluefin
   – Used Alpaca as a slick front end for managing models
   – Loaded GPT-OSS (20B parameters) into VRAM and ran queries locally
   – Verified GPU memory usage with Mission Center
   – Performance tests: pirate stories, the history of the Great Wall, and coding a tic-tac-toe game in HTML/JavaScript, all running locally with no subscription needed
• Alternative Interfaces
   – Tested LM Studio as an easier app for managing models
   – Compared it with Alpaca for speed, control, and usability
   – Showed how both tools make running LLMs on Linux approachable
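The local setup above can be sketched roughly as follows. The `ujust ollama` recipe name and the `gpt-oss:20b` model tag are assumptions based on current Bluefin and Ollama conventions, so check `ujust --choose` and the Ollama model library before running:

```shell
# Install the Ollama backend (Bluefin ships a ujust recipe for this;
# on other distros, Ollama's upstream install script works instead)
ujust ollama

# Pull the 20B GPT-OSS model and run a query locally
ollama pull gpt-oss:20b
ollama run gpt-oss:20b "Tell me a short pirate story."

# Watch VRAM fill up while the model is loaded
nvidia-smi
```

A 20B model at common quantisation levels fits comfortably in the RTX 5060 Ti's 16GB of VRAM, which is why it runs entirely on the GPU here.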

💡 Why This Matters

AI doesn’t have to mean expensive subscriptions or hype-driven apps. With a modest Linux workstation, you can:
• Run LLMs locally on your GPU for free
• Use open models like GPT-OSS or Gemma without hitting the cloud
• Save cloud AI for when you truly need it (pay-as-you-go via Groq, OpenAI, etc.)
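Once Ollama is running, anything on your machine can talk to it over its local REST API (port 11434 by default), which is what front ends like Alpaca do under the hood. A minimal sketch, assuming the `gpt-oss:20b` model from the video is already pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Encode a non-streaming generate request for the Ollama REST API."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the locally running model and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the Ollama service running locally):
# print(ask("gpt-oss:20b", "Tell me a one-line pirate joke."))
```

Nothing leaves your machine: the same script works offline, and swapping in a different pulled model is just a change to the model string.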

This episode shows how to turn a £1000 Linux rig into an AI workstation — no drama, no fear, just practical AI that makes your life simpler.

📌 Who This Is For
• Linux users curious about AI without subscriptions
• Developers exploring local coding assistants
• Content creators looking to use AI in workflows without monthly fees
• Windows/macOS switchers wanting to build an affordable AI workstation

🔔 Subscribe

Follow the Linux Out of the Box series as I test more distros and use-cases — from gaming (Bazzite, Aurora) to creative workflows (Bluefin, Ubuntu Studio) — and now AI on Linux desktops.

#Linux #Bluefin #LinuxOutOfTheBox #AI #LocalAI #Ollama #Alpaca #Groq #Claude #ChatGPT #LMStudio #LinuxForCreators
