Hi everyone!
We're excited to share a major update from the Genum team: this one is big.
| Update | Description |
|---|---|
| Open Source & Self-Hosted | Genum is now open source and can be self-hosted on your own infrastructure. |
| Custom LLM Providers | Connect local models (LM Studio, Ollama, self-hosted endpoints), integrate non-OpenAI vendors, and run fully offline or hybrid setups. No lock-in: your models, your infrastructure. |
| New OpenAI & Gemini Models | Use the latest OpenAI and Gemini models directly inside your existing Genum workflows, without migration or reconfiguration. |
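Local runtimes such as Ollama and LM Studio expose OpenAI-compatible chat-completions endpoints, which is what makes the offline and hybrid setups above possible. As a minimal sketch (the endpoint URLs, model name, and prompt below are illustrative assumptions, not Genum configuration), a custom-provider request body looks like this:

```python
import json

# Assumption: default local ports for Ollama and LM Studio's
# OpenAI-compatible servers; adjust to your own setup.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(model: str, prompt: str) -> str:
    """Build the JSON body an OpenAI-compatible provider expects."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

# Hypothetical model name for illustration only.
body = build_request("llama3", "Summarize this ticket.")
print(body)
```

Because the request shape is the same across providers, switching between a cloud vendor and a local model is largely a matter of changing the base URL and model name.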
Join Us
Genum is built in the open: feedback, issues, PRs, and ideas are more than welcome.
Thanks for being part of the journey!
Let's build practical, production-ready AI together.
The Genum Team