source: techcrunch ai: osaurus brings both local and cloud ai models to your mac
level: technical
osaurus is an open-source llm server built exclusively for mac. it grew out of dinoki, a desktop ai companion. co-founder terence pae said customers did not want to pay for tokens on top of buying an app, which led to a focus on running ai locally. the tool lets users switch between different local ai models or connect to cloud services like openai and anthropic, while all files and tools stay on the user's own machine.
the app works as a control layer that connects multiple ai models through a single interface. it is not only for developers: it offers a simple consumer interface and runs models in a hardware-isolated virtual sandbox, which limits what the ai can access on the computer. users can pick the model that fits their task, since different models have different strengths. it supports local models such as llama and deepseek as well as apple's on-device models, connects to cloud providers, and works as a full model context protocol (mcp) server with over 20 native plugins for mail, calendar, browser, and more.
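a minimal sketch of what that one-interface switching could look like, assuming osaurus exposes an openai-compatible chat completions endpoint on localhost (the port, url paths, and model names below are illustrative assumptions, not confirmed details from the article):

```python
import json

# hypothetical backend map: a local osaurus-style server and a cloud provider,
# both speaking the same openai-style chat completions wire format.
BACKENDS = {
    "local": "http://localhost:1337/v1/chat/completions",
    "openai": "https://api.openai.com/v1/chat/completions",
}


def build_chat_request(backend: str, model: str, prompt: str) -> tuple[str, str]:
    """return (url, json body) for an openai-style chat completion call.

    the request shape is identical for every backend; only the url and the
    model name change, which is what makes a single control layer possible.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return BACKENDS[backend], json.dumps(body)


# same payload structure works against either backend
url, body = build_chat_request("local", "llama-3.2-3b", "summarize this file")
```

the point of the sketch is the design choice, not the specifics: because local and cloud backends share one request format, switching between them is a url swap rather than a rewrite.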
running ai locally needs a lot of ram: the system requires at least 64 gb, and larger models call for around 128 gb. pae believes local ai efficiency will improve over time, noting that local ai has gone from barely finishing sentences to running tools and writing code. the project has been downloaded more than 112,000 times. the founders are in an accelerator and are exploring business uses in legal and healthcare, where privacy matters. they also see local ai as a way to reduce reliance on power-hungry data centers by using on-premise mac studios.
why it matters: it gives ai and data science users a flexible way to run models locally for privacy, or to switch to cloud when needed, all from one mac app.