* update README ruby link
The ollama-ai Ruby gem is far less popular and appears unmaintained:
https://rubygems.org/gems/ollama-ai
The de facto standard with the most downloads in the Ruby ecosystem is ruby_llm:
https://rubygems.org/gems/ruby_llm
I would link to that instead, to avoid complications and to guarantee feature compatibility with Ollama.
* Update ruby_llm gem link from website to GitHub
Ollama mostly links to GitHub rather than to project websites, hence the link to the ruby_llm GitHub repository.
* MLX - dynamic loading of mlx-c
Create a wrapper layer that indirects the dependency on mlx-c so that
the main ollama binary has no load-time dependency on mlx-c, mlx, or, on Linux, CUDA.
The library is lazily loaded via dlopen so the search path can be adjusted
to ensure the dependencies are found, and the binary fails gracefully if they are not present.
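A minimal sketch of what such a dlopen-based indirection can look like in C. The library path and the mlx_init entry point are hypothetical placeholders, not the wrapper's real API:

```c
// Hypothetical sketch: lazily load an mlx-c wrapper library at runtime.
// Library path and symbol names are illustrative assumptions.
#include <dlfcn.h>
#include <stdio.h>

// Function-pointer type for one assumed mlx-c entry point.
typedef int (*mlx_init_fn)(void);

static void *mlx_handle = NULL;
static mlx_init_fn mlx_init = NULL;

// Returns 0 on success, -1 if the library or symbol is unavailable.
static int mlx_wrapper_load(const char *path) {
    if (mlx_handle != NULL) {
        return 0; // already loaded
    }
    // RTLD_LAZY defers symbol resolution until first use; the caller
    // supplies the path, so it can be adjusted to locate bundled
    // dependencies (mlx, and CUDA on Linux) before loading.
    mlx_handle = dlopen(path, RTLD_LAZY | RTLD_LOCAL);
    if (mlx_handle == NULL) {
        fprintf(stderr, "mlx backend unavailable: %s\n", dlerror());
        return -1; // fail gracefully instead of aborting at load time
    }
    mlx_init = (mlx_init_fn)dlsym(mlx_handle, "mlx_init");
    if (mlx_init == NULL) {
        dlclose(mlx_handle);
        mlx_handle = NULL;
        return -1;
    }
    return 0;
}
```

Because nothing is resolved until mlx_wrapper_load is called, the main binary still starts on machines where mlx-c and its dependencies are absent, and callers can fall back to another backend when loading fails.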
* address review comments
* fix broken tests
Void is an open source AI code editor and Cursor alternative that supports
Ollama. It's built on VS Code and allows users to connect directly to Ollama
for private LLM usage without going through a middleman backend.
Key features:
- Open source Cursor alternative
- Direct Ollama integration
- VS Code fork with full compatibility
- Agent mode and MCP support
- Works with any open source model
Fixes #12919
Signed-off-by: Samaresh Kumar Singh <ssam3003@gmail.com>
Co-authored-by: A-Akhil <akhilrahul70@gmail.com>
This PR introduces a new ollama embed command that allows users to generate embeddings directly from the command line.
- Adds an ollama embed MODEL [TEXT...] command for generating text embeddings
- Supports both direct text arguments and stdin piping for scripted workflows
- Outputs embeddings as JSON arrays (one per line)
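Hypothetical invocations following the syntax above (the model name is only an example):

```shell
# embed a string passed as an argument; prints one JSON array
ollama embed all-minilm "The sky is blue"

# pipe text via stdin for scripted workflows
cat notes.txt | ollama embed all-minilm
```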
This commit updates the README to include macLlama within the community integrations section.
macLlama is a native macOS application built for lightweight and efficient LLM interaction. Key features include:
* **Lightweight & Native:** Designed to be resource-friendly and perform optimally on macOS.
* **Chat-like Interface:** Provides a user-friendly, conversational interface.
* **Multiple Window Support:** Allows users to manage multiple conversations simultaneously.
The primary goal of macLlama is to offer a simple and easy-to-run LLM experience on macOS.
This PR adds Tiny Notepad, a lightweight, notepad-like interface to chat with local LLMs via Ollama.
- It’s designed as a simple, distraction-free alternative.
- The app supports basic note-taking, timestamped logs, and model parameter controls.
- Built with Tkinter, it runs entirely offline and is available via PyPI.
It aims to provide a lightweight interface for Ollama that is easy to install and run.