OrKa
Open-source tool for building AI workflows using YAML configuration instead of Python code, with built-in memory and local LLM support.
At a Glance
Pricing
Free and open-source under Apache 2.0 license
About OrKa
OrKa is an open-source AI orchestration tool that simplifies building AI workflows by using YAML configuration files instead of complex Python code. It serves as a streamlined alternative to frameworks like CrewAI or LangChain, focusing on simplicity and ease of use. With OrKa, developers can chain AI agents together by describing what they want in simple configuration files, making AI workflow development accessible to a broader audience.
- YAML Configuration – Define AI workflows in simple YAML files instead of writing Python code, reducing complexity and development time significantly.
- Built-in Memory – Features semantic memory that retains conversations and facts, supports semantic search, and automatically expires old data according to configurable retention policies.
- Local LLM Support – Works with Ollama, LM Studio, and other local models for privacy-conscious deployments without sending data to external APIs.
- Smart Routing – Supports conditional logic, parallel processing, and loops all defined within YAML configuration files.
- Web Search Integration – Built-in web search agents provide access to up-to-date information for research and Q&A workflows.
- Simple Setup – Docker-based deployment with Redis for memory storage; install with pip and start building workflows in three commands.
- Memory Management – Includes a TUI (Terminal User Interface) for watching memory operations, automatic cleanup, and semantic search capabilities.
- Open Source – Released under the Apache 2.0 license, developed on GitHub, and driven by its community.
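To make the YAML-first approach concrete, here is a rough sketch of what a workflow file could look like. This is illustrative only: the agent types, field names, and overall schema are assumptions, not taken from OrKa's actual specification, so check the official documentation and examples for the real format.

```yaml
# Hypothetical OrKa workflow sketch -- field names and agent types
# are illustrative assumptions, not the official schema.
orchestrator:
  id: qa-with-memory
  agents:
    - memory_check        # look up prior context first
    - answer              # then generate a response

agents:
  - id: memory_check
    type: memory          # assumed memory-agent type
    config:
      operation: read     # semantic search over stored facts

  - id: answer
    type: local_llm       # assumed local-model agent type
    provider: ollama      # local LLM backend named in the docs
    model: llama3         # any locally available model
    prompt: |
      Using any retrieved context, answer: {{ input }}
```

The point of the structure is that chaining, routing, and memory access are declared in the file rather than coded in Python.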
To get started with OrKa, install it via pip with pip install orka-reasoning==0.9.12, then run orka-start to initialize Redis and the UI, and finally execute workflows with orka run my-workflow.yml "Your query here". Common use cases include Q&A systems with memory, local AI chat applications, research assistants, content pipelines, and data processing workflows. The tool supports LM Studio-style endpoints and provides comprehensive documentation with examples and API references.
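The quick-start steps described above amount to three shell commands; the version pin, workflow filename, and query string are the ones named in the text:

```shell
# Install the OrKa package at the version shown above
pip install orka-reasoning==0.9.12

# Initialize Redis and the UI
orka-start

# Execute a workflow against a query
orka run my-workflow.yml "Your query here"
```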

Pricing
Free Plan Available
Free and open-source under Apache 2.0 license
- YAML workflow configuration
- Built-in memory
- Local LLM support
- Smart routing
- Web search integration
Capabilities
Key Features
- YAML-based workflow configuration
- Built-in semantic memory
- Local LLM support (Ollama, LM Studio)
- Smart routing with conditional logic
- Parallel processing
- Loop support in workflows
- Web search integration
- Docker-based deployment
- Redis memory storage
- Terminal User Interface (TUI)
- Automatic memory cleanup
- Semantic search
- CLI tools