Introducing HoML: Run Local AI with Ease and Speed

Published on 2025-08-09 by The HoML Team

We are incredibly excited to share something we've been working on: HoML is now live and ready for you to try at https://homl.dev!

What is HoML?

HoML is a new tool designed to make running powerful, open-source AI models on your own computer as simple as possible. And in case you were wondering, "HoML" is just a name, not an acronym!

Our goal with HoML is to combine the wonderful, user-friendly experience of Ollama with the high-performance inference engine of vLLM. We love what both projects are doing and wanted to bring their strengths together to create a tool that is both easy to use and incredibly fast.

Key Features

With HoML, you get an intuitive CLI that just works, allowing you to:

  • Get started in minutes with a one-line installation script.
  • Easily download models from the Hugging Face Hub with a simple homl pull command.
  • Chat with your models directly in your terminal using homl run.
  • Integrate with your existing tools through a built-in, OpenAI-compatible API (see the sketch after this list).
  • Run everything offline, keeping your data private and secure on your machine.
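
Because the API is OpenAI-compatible, existing OpenAI client libraries should work against a local HoML server with only the base URL changed. Below is a minimal sketch using the official openai Python package; the port, API key, and model name are illustrative assumptions rather than confirmed HoML defaults.

    from openai import OpenAI

    # Point the standard OpenAI client at the local HoML server instead of
    # the OpenAI cloud. The port and API key below are placeholders; check
    # your HoML configuration for the actual values.
    client = OpenAI(
        base_url="http://localhost:8080/v1",  # assumed local endpoint
        api_key="not-needed",                 # local servers typically ignore the key
    )

    # Send a chat request to the locally running model. "my-local-model" is a
    # placeholder for whatever model you pulled with `homl pull`.
    response = client.chat.completions.create(
        model="my-local-model",
        messages=[{"role": "user", "content": "Say hello from my own machine!"}],
    )

    print(response.choices[0].message.content)

Any other tool that speaks the OpenAI API, such as editor plugins or agent frameworks, can be pointed at the same endpoint in the same way.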

This is Just the Beginning

We're launching HoML today in beta. We've implemented the core features we envisioned, but we know there's still work to be done and rough edges to smooth out.

We truly believe in the power of open-source and community. We're building HoML in the open, and we're humbled to be standing on the shoulders of giants like the teams behind vLLM and Ollama. We are excited to see what you build, create, and discover with HoML.

How to Get Involved

Your feedback, bug reports, and contributions will be invaluable as we continue to improve the platform.

  • Report issues: If you encounter any bugs or have a feature request, please open an issue on our GitHub Issues page.
  • Contribute: We welcome contributions! Check out our project on GitHub and feel free to open a pull request.

Join us on this journey and help us shape the future of local AI.

Get started now at https://homl.dev.