I served a 200 billion parameter LLM from a Lenovo workstation the size of a Mac Mini

February 17, 2026

You can run local AI inference on more or less any machine you have access to. From a modestly specced…