Ollama 0.9.5: Revolutionary LLM app for macOS with many new features!


Discover the latest features of Ollama 0.9.5: improved performance for LLMs on macOS, Windows and Linux.



The world of artificial intelligence has opened a new chapter with the release of version 0.9.5 of the open-source tool Ollama. As of July 5, 2025, the software for locally managing Large Language Models (LLMs) is available in its updated form. As stadt-bremerhaven.de reported, the tool has been fundamentally reworked and now ships as a native macOS application, which brings noticeable improvements, including faster startup times and a reduced memory footprint.

Particularly notable is the move away from Electron to a native macOS app, which shrinks the installation size by about 25 percent. Version 0.9.5 also brings new features: there is now a settings window for sharing language models over the network, so a powerful machine can host resource-intensive LLMs while other devices on the network access them. In addition, models can now be stored on external drives or in alternative directories.
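Network sharing and alternative model locations map onto environment variables that the Ollama server honors. A minimal configuration sketch (the variable names come from the Ollama documentation; the paths are placeholders, not defaults from the article):

```shell
# Expose the Ollama server to other machines on the local network
# (the default binding is 127.0.0.1, i.e. local-only)
export OLLAMA_HOST=0.0.0.0:11434

# Store downloaded models on an external drive instead of the default directory
export OLLAMA_MODELS=/Volumes/ExternalSSD/ollama-models

# Start the server with these settings in effect
ollama serve
```

With this in place, other devices on the network can point their Ollama clients at the host machine's IP address and port 11434.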

Versatile uses and advantages

Ollama's versatility isn't limited to macOS. The tool is also available for Windows and Linux, providing a unified interface for downloading, managing and running LLMs across operating systems, as markaicode.com highlights. More and more users are discovering the benefits of running LLMs locally: all data processing stays on your own machine, which ensures complete privacy; there are no API usage costs; and the models can be run offline.
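Locally, Ollama exposes a small HTTP API on port 11434 that works the same way on all three platforms. A minimal sketch of calling it from Python using only the standard library (it assumes `ollama serve` is running and the named model has already been pulled; the function names here are illustrative, not part of Ollama):

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the model's reply."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running server and a pulled model, e.g. `ollama pull mistral`
    print(generate("mistral", "Explain a large language model in one sentence."))
```

Because the request never leaves localhost, the prompt and the model's answer stay on your own machine, which is exactly the privacy argument made above.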

The hardware requirements are moderate, so even users with older gaming GPUs or notebooks can venture into the world of LLMs. A modern multi-core processor and at least 8 GB of RAM are recommended for optimal performance. Supported models include Llama 2, Code Llama, Mistral and Vicuna, so a suitable model can be found for almost every use case.

Open source as a game changer

Further advantages of open-source LLMs include independence from external providers. fraunhofer.de points out that sensitive data remains within your own network and that operating costs are transparent and predictable. These technologies made significant progress throughout 2023, fueling interest in generative AI; especially since the advent of OpenAI's ChatGPT, demand for LLMs has proven enormous.

The ability to adapt models to specific use cases via fine-tuning opens up new horizons for companies and developers. New performance-optimization techniques are also continually emerging that allow acceptable speeds even on less powerful hardware.
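Full fine-tuning happens outside Ollama itself, but the tool does support lightweight customization of a base model through a Modelfile. A minimal sketch (the model name `docs-assistant` and the prompt text are made up for illustration):

```
# Modelfile: derive a customized assistant from a base model
FROM mistral

# Lower temperature for more deterministic, less creative answers
PARAMETER temperature 0.3

# System prompt baked into the derived model
SYSTEM You are a concise assistant for internal company documentation.
```

The derived model is then built and run locally with `ollama create docs-assistant -f Modelfile` followed by `ollama run docs-assistant`.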

With the new version of Ollama, entry into the world of artificial intelligence becomes even easier for many users. It remains exciting to see how the technology will continue to develop and what new opportunities it will create. The combination of ease of use and powerful features makes Ollama a tempting offer for anyone working in the field of artificial intelligence.