Open Source AI Platform - AI Chat with advanced features that works with every LLM
npx skills add https://github.com/onyx-dot-app/onyx --skill image-generation
Install this skill via the CLI and start using the SKILL.md workflow in your workspace.
Onyx is the application layer for LLMs, bringing a feature-rich interface that anyone can easily host.
Onyx extends LLMs with advanced capabilities like RAG, web search, code execution, file creation, deep research, and more.
Connect your applications with over 50 indexing-based connectors provided out of the box, or via MCP.
[!TIP]
Deploy with a single command: `curl -fsSL https://onyx.app/install_onyx.sh | bash`

Onyx supports all major LLM providers, both self-hosted (like Ollama, LiteLLM, vLLM, etc.) and proprietary (like Anthropic, OpenAI, Gemini, etc.).
To learn more, check out our docs!
Onyx supports deployments in Docker, Kubernetes, Helm/Terraform and provides guides for major cloud providers.
Detailed deployment guides can be found here.
Onyx supports two separate deployment options: standard and lite.
Lite mode can be thought of as a lightweight Chat UI. It requires fewer resources (under 1 GB of memory) and runs a simpler stack.
It is great for users who want to try out Onyx quickly, or for teams who are only interested in the Chat UI and Agents functionality.
Standard mode provides the complete feature set of Onyx and is recommended for serious users and larger teams. Additional components not included in Lite mode:
[!TIP]
To try Onyx for free without deploying, visit Onyx Cloud.
Onyx is built for teams of all sizes, from individual users to the largest global enterprises:
There are two editions of Onyx:
For feature details, check out our website.
Join our open source community on Discord!
Looking to contribute? Please check out the Contribution Guide for more details.