Requirements to run a Private AI

AI can run on almost anything: a laptop, an old desktop, whatever you already have.

But on a normal home computer you are constrained by speed and by the size of model you can load.

AI cares about two things: context (RAM) and thinking (GPU).

AI PCs are normal PCs, except they have powerful GPUs with a lot of VRAM and/or unified memory, typically starting at 64GB and going to 128GB or more of RAM.
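As a rough rule of thumb, a model's memory footprint is its parameter count multiplied by the bytes each parameter occupies at a given quantisation, plus some overhead for the context window and runtime. The back-of-the-envelope sketch below uses that assumption (the 20% overhead figure is purely illustrative) to show why 64GB to 128GB is where the larger models start to fit.

```python
# Back-of-the-envelope estimate of how much memory a local model needs.
# Assumption (illustrative only): memory ~= parameters x bytes-per-parameter
# at a given quantisation, plus ~20% overhead for context and runtime.

def estimate_model_memory_gb(params_billion: float, bits_per_param: int,
                             overhead: float = 0.2) -> float:
    """Rough memory footprint in GB for a quantised model."""
    bytes_total = params_billion * 1e9 * (bits_per_param / 8)
    return bytes_total * (1 + overhead) / 1e9

if __name__ == "__main__":
    # A 70-billion-parameter model at different quantisation levels.
    for bits in (4, 8, 16):
        print(f"70B model at {bits}-bit: ~{estimate_model_memory_gb(70, bits):.0f} GB")
```

At 4-bit quantisation a 70B model lands around 40GB, which fits in 64GB of unified memory; at full 16-bit precision it needs well over 128GB, which is why the bigger configurations matter.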

If you have a modest laptop or desktop, you can run something like LM Studio to get a feel for what local models can do.

Client-server setups such as Ollama (the model server) paired with Open WebUI (the browser front end) are popular as well.
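For a sense of what the client-server approach looks like in practice, here is a minimal sketch of talking to a locally running Ollama server from Python. It assumes Ollama is installed and serving on its default port, and that a model (the "llama3" name is just an example) has already been pulled.

```python
# Minimal sketch: query a local Ollama server over its HTTP API.
# Assumes Ollama is running on its default port (11434) and the
# example model "llama3" has been pulled with `ollama pull llama3`.
import requests

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("In one sentence, what is a private AI?"))
```

Open WebUI talks to the same local server, so anything you can do from the browser front end you can also script this way.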

Once you’ve run some models on your existing hardware, you can decide which direction you wish to go. There are two camps:

1. CUDA: Closed and owned by Nvidia, but significantly faster and better supported by the AI ecosystem than the open alternatives.

2. Open source: Runs on CPUs and non-Nvidia GPUs, but often lags behind the curve on performance, support and compatibility. A quick way to check which camp your machine falls into is sketched below.
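If you are unsure which camp your current machine falls into, any framework that exposes its compute backends can tell you. The sketch below uses PyTorch, assuming it is installed; the framework choice is just an example, and other tools offer similar probes.

```python
# Quick probe of the available compute backend, using PyTorch.
# Assumes PyTorch is installed; the framework choice is just an example.
import torch

if torch.cuda.is_available():
    # An Nvidia GPU with CUDA: the fast, widely supported path.
    print(f"CUDA GPU detected: {torch.cuda.get_device_name(0)}")
elif torch.backends.mps.is_available():
    # Apple Silicon unified memory via the Metal (MPS) backend.
    print("Apple Metal (MPS) backend available")
else:
    # No GPU backend: models will still run, but on the CPU and slowly.
    print("No GPU backend detected; falling back to CPU")
```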

If you want a relatively easy buy-and-forget option and don’t mind being locked in, then the Nvidia DGX Spark, at $3,600, is an AI in a box. It requires some technical knowledge to set up and is primarily aimed at AI developers, but it can run genuinely useful models and gives easy access to the tools that make those models useful (search, document analysis, image generation, agentic tool access, etc.).

The other option is to build your own or buy an off-the-shelf PC designed for AI. Realistically these start becoming useful at about $3,000 and go up to around $20,000 for a full system.