# Optimizing Ollama Storage: Running LLMs from an External USB-C Drive

Published on April 10, 2026

Tags: AI, Ollama, Hardware, Storage, Optimization, M1, LLM, Local-AI, Apple-Silicon

A guide on moving local LLM models to external storage to reclaim internal SSD space on a Mac while maintaining inference performance.
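As a preview of the approach, Ollama reads the `OLLAMA_MODELS` environment variable to decide where model blobs live, so pointing it at a directory on an external drive is the core move. A minimal sketch, assuming a hypothetical volume name `ExtSSD` (substitute your own mount point):

```shell
# Hypothetical external drive path -- replace "ExtSSD" with your volume's name.
EXTERNAL_MODELS="/Volumes/ExtSSD/ollama-models"

# Create the target directory on the external drive.
mkdir -p "$EXTERNAL_MODELS"

# Tell Ollama to store and load models from the external drive.
# (OLLAMA_MODELS is the documented variable for the model directory.)
export OLLAMA_MODELS="$EXTERNAL_MODELS"

# Restart the ollama server in this shell so it picks up the new location:
#   ollama serve
```

Note that an `export` only affects processes launched from that shell; if you run Ollama as a macOS app or background service, the variable must be set where that process can see it, which the guide below covers.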