The developers of Ollama have released a graphical application for macOS and Windows that lets various large language models run directly on the computer. Previously, Ollama was operated only via the command line; now users have a friendly interface that requires no special knowledge. The app lets you choose models from a list, including Gemma, DeepSeek, and Qwen, and download other models via the command line.
After installing Ollama, the user can pick the desired model from a dropdown list next to the query field. Downloaded models appear in this list, allowing quick switching between them. The app is free for macOS and Windows but does not yet support Linux. To get started, download the installer, run it, and follow the simple instructions.
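The dropdown is backed by the models already downloaded to the machine; the same list is exposed by the local Ollama server's documented `/api/tags` endpoint. As a minimal sketch (assuming Ollama is installed and serving on its default port, 11434), the models can also be listed programmatically:

```python
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default local address


def list_models_request(base_url: str = OLLAMA_BASE) -> urllib.request.Request:
    """Build a GET request for Ollama's /api/tags endpoint,
    which returns the models downloaded to this machine."""
    return urllib.request.Request(f"{base_url}/api/tags", method="GET")


if __name__ == "__main__":
    # Requires a running Ollama instance; prints each local model's name.
    with urllib.request.urlopen(list_models_request()) as resp:
        data = json.load(resp)
    for model in data.get("models", []):
        print(model["name"])
```

The request targets localhost only, so it works offline once models are downloaded.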
Ollama makes it possible to work with AI locally, which improves privacy and reduces dependence on cloud services. Queries run entirely on the user's machine, so no data is transmitted to third-party services. The program suits those who value protecting personal information or want to reduce reliance on an internet connection.
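The same local-only workflow is available outside the app through the server's documented `/api/generate` endpoint. A minimal sketch, assuming Ollama is running on its default port and a model (the name `gemma3` here is illustrative) has already been downloaded:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint


def build_prompt_payload(model: str, prompt: str) -> dict:
    """Assemble the JSON body for a single, non-streaming completion."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return its reply.
    The request goes to localhost only, so no data leaves the machine."""
    body = json.dumps(build_prompt_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


if __name__ == "__main__":
    # Example model name; substitute any model installed locally.
    print(ask("gemma3", "Why is the sky blue?"))
```

Setting `"stream": False` returns one complete JSON object instead of a stream of partial tokens, which keeps the client code simple.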
The Ollama interface is intuitive, making it suitable for both beginners and experienced users. All models available in the Ollama library can be used at no additional cost, making the app a convenient tool for testing the capabilities of local AI on a home or work computer.