Enable developers to extend Omi’s chat capabilities by exposing their own custom tools (functions) via backend APIs. These tools can be invoked by the chat’s LLM (Large Language Model) during conversation, supporting single or multi-tool (chained) operations—all defined and handled entirely in code, with no need for UI-based tool creation. This turns Omi’s chat from a simple prompt/response system into a powerful, developer-controlled, orchestrated agent platform.
Agent Mate Example:
The developer defines backend tool endpoints like calendar_get_events and notion_append_page. When a user asks in chat, “Check my calendar and write upcoming events into Notion,” the LLM interprets the intent and triggers these tools in sequence via Omi.
Email Automation:
The developer implements tools for “send email,” “attach file,” and “label email.” Users can naturally trigger these from chat; the LLM calls the relevant backend APIs as tools.
SaaS Integrations:
A developer integrates tools for third-party platforms (e.g., “create invoice in accounting system,” “sync customer data to CRM”). Chat commands trigger these tools.
Workflow Chaining:
Users issue a command that requires multiple tools in sequence (e.g., fetch events → filter important ones → export as PDF → upload to Drive).
Enterprise Automation:
Internal teams define secure, business-specific backend tool APIs (e.g., “generate sales report,” “escalate ticket”) that chat users can access via Omi.
Maximum developer control: All logic is implemented in code, with no UI-based tool creation required.
Dynamic function calling: LLM can call any developer-defined function, with real-time parameter passing and chaining.
Advanced workflows: Supports multi-tool workflows and complex logic far beyond basic chat prompts.
Separation of concerns: Chat UI stays clean and user-friendly, while all tool complexity lives in backend APIs.
Extensible ecosystem: Any app or service can expose an unlimited number of tools, each as a well-defined endpoint.
Competitive edge: Transforms Omi chat into an AI-driven automation/orchestration hub, not just a chatbot.
Security: Tool endpoints must validate and authorize requests; Omi should strictly manage authentication, scopes, and data permissions.
API stability: If a developer changes or removes tool endpoints, chat features depending on them may break.
LLM integration: Precise function/parameter schemas are needed for correct intent mapping and function invocation.
Debugging: Multi-tool chaining introduces error propagation; logs and traceability must be robust.
Performance: Chained API calls can add latency; retries and rate-limiting should be managed gracefully.
Documentation: Developers need clear docs and examples to define tools correctly.
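One way to address the retry/latency concern above is a small retry-with-backoff wrapper around each tool call. This is a hypothetical sketch; the function names and backoff policy are illustrative, not Omi's actual behavior:

```python
import time

def call_with_retries(fn, *, attempts=3, base_delay=0.1):
    """Call fn(); on exception, retry with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: propagate the error to the chat layer
            time.sleep(base_delay * (2 ** attempt))

# Demo: a flaky tool that fails twice, then succeeds on the third call.
calls = {"n": 0}
def flaky_tool():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("endpoint timed out")
    return {"success": True}

result = call_with_retries(flaky_tool)
```

A production version would also cap total elapsed time and distinguish retryable errors (timeouts, 5xx) from permanent ones (auth failures), so a bad Notion API key is surfaced immediately rather than retried.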
Tool Definition:
The developer implements tool endpoints in their backend, each described with a manifest (name, parameters, endpoint, description).
Registration:
The manifest is registered with Omi; the platform knows which tools are available for each developer’s chat integration.
LLM Prompting:
During chat, Omi’s LLM is prompted with the manifest: function names, parameter schemas, and descriptions.
Intent Mapping:
When the user’s message requires tool use, the LLM outputs a function call request (per OpenAI, Gemini, or Anthropic standards).
Relay & Execution:
Omi’s backend receives the tool-call, securely relays it to the developer’s backend endpoint, and returns the result.
Chaining:
If the LLM determines a workflow needs multiple tools, Omi handles sequential calls, passing outputs as needed.
Response:
Results (or errors) are returned to the user in chat, transparently.
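The chaining step can be sketched as a simple loop: the LLM emits an ordered plan of tool calls, and the orchestrator feeds each tool's output into the next call. This is a hypothetical illustration; the `$prev` output-passing convention and the local stub tools stand in for real developer endpoints and are invented for the example:

```python
def calendar_get_events(start_date: str, end_date: str) -> list:
    # Stub: a real endpoint would query the user's calendar.
    return [{"title": "Game Jam", "date": "2024-05-02"},
            {"title": "Unity Training", "date": "2024-05-05"}]

def notion_append_page(page_id: str, content) -> dict:
    # Stub: a real endpoint would call the Notion API.
    return {"success": True, "appended": content}

TOOLS = {"calendar_get_events": calendar_get_events,
         "notion_append_page": notion_append_page}

def run_chain(plan: list) -> dict:
    """Execute tool calls in order; the placeholder '$prev' in a
    parameter is replaced by the previous call's result."""
    prev = None
    for step in plan:
        params = {k: (prev if v == "$prev" else v)
                  for k, v in step["parameters"].items()}
        prev = TOOLS[step["name"]](**params)
    return prev

# A plan the LLM might emit for "check my calendar and write events to Notion":
plan = [
    {"name": "calendar_get_events",
     "parameters": {"start_date": "2024-05-01", "end_date": "2024-05-07"}},
    {"name": "notion_append_page",
     "parameters": {"page_id": "page-1", "content": "$prev"}},
]
result = run_chain(plan)
```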
Key components:
Manifest File (JSON/YAML): Developers define all available tools, parameters, and endpoint URLs.
Omi Chat API: Accepts chat input, invokes LLM with tool manifest.
LLM (Function-Calling/Tool-Use): Supports developer-defined functions, returns function calls as needed.
Tool Proxy/Dispatcher: Relays function calls from Omi to the correct developer endpoint.
Developer Backend: Implements all logic and responds to tool-call requests.
Chaining Logic: Managed by LLM and Omi, handling multi-step workflows.
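The Tool Proxy/Dispatcher component might look like the following sketch: given a registered manifest and an LLM function call, it resolves the endpoint URL and builds the JSON payload to POST to the developer's backend. The `MANIFEST` mapping, `dispatch` signature, and injectable transport are assumptions made so the example stays self-contained (no real HTTP is performed here):

```python
import json
from urllib import request as urlrequest

# Assumed registration result: tool name -> developer endpoint URL.
MANIFEST = {
    "calendar_get_events": "https://agentmate.com/api/tool/calendar_get_events",
    "notion_append_page": "https://agentmate.com/api/tool/notion_append_page",
}

def dispatch(tool_call: dict, user_id: str, post=None) -> dict:
    """Relay one LLM tool call to the matching developer endpoint."""
    name = tool_call["name"]
    if name not in MANIFEST:
        return {"success": False, "error": f"Unknown tool: {name}"}
    payload = {"user_id": user_id, "parameters": tool_call["parameters"]}
    if post is None:
        def post(url, body):  # default transport: a real HTTP POST
            req = urlrequest.Request(
                url, data=json.dumps(body).encode(),
                headers={"Content-Type": "application/json"})
            with urlrequest.urlopen(req) as resp:
                return json.loads(resp.read())
    return post(MANIFEST[name], payload)

# Fake transport for demonstration: echoes the payload back instead of POSTing.
echo = lambda url, body: {"success": True, "result": {"url": url, "sent": body}}
out = dispatch({"name": "calendar_get_events",
                "parameters": {"start_date": "2024-05-01",
                               "end_date": "2024-05-07"}},
               user_id="abc123", post=echo)
```

Injecting the transport also gives a natural seam for the security layer: the real `post` would attach auth headers and enforce scopes before anything reaches the developer's backend.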
Sample Manifest:
{
  "tools": [
    {
      "name": "calendar_get_events",
      "description": "Fetch upcoming calendar events.",
      "parameters": {
        "start_date": "string",
        "end_date": "string"
      },
      "endpoint": "https://agentmate.com/api/tool/calendar_get_events"
    },
    {
      "name": "notion_append_page",
      "description": "Append a list of events to a Notion page.",
      "parameters": {
        "page_id": "string",
        "content": "string"
      },
      "endpoint": "https://agentmate.com/api/tool/notion_append_page"
    }
  ]
}
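At registration time, Omi could validate a manifest like the one above before accepting it, checking that every tool declares the fields the LLM and dispatcher need. The field names follow the sample manifest; the validation rules themselves are illustrative assumptions:

```python
import json

REQUIRED_FIELDS = {"name", "description", "parameters", "endpoint"}

def validate_manifest(raw: str) -> list:
    """Return a list of problems; an empty list means the manifest is usable."""
    problems = []
    manifest = json.loads(raw)
    for i, tool in enumerate(manifest.get("tools", [])):
        missing = REQUIRED_FIELDS - tool.keys()
        if missing:
            problems.append(f"tool {i}: missing {sorted(missing)}")
        elif not tool["endpoint"].startswith("https://"):
            problems.append(f"tool {i}: endpoint should use HTTPS")
    return problems

sample = """{
  "tools": [
    {"name": "calendar_get_events",
     "description": "Fetch upcoming calendar events.",
     "parameters": {"start_date": "string", "end_date": "string"},
     "endpoint": "https://agentmate.com/api/tool/calendar_get_events"}
  ]
}"""
errors = validate_manifest(sample)
```

Rejecting bad manifests up front also mitigates the API-stability concern: a renamed or removed tool fails at registration rather than mid-conversation.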
Sample Tool Endpoint:
Request:
{
  "user_id": "abc123",
  "parameters": {
    "start_date": "2024-05-01",
    "end_date": "2024-05-07"
  }
}
Response (success):
{
  "success": true,
  "result": [
    {
      "title": "Game Jam",
      "date": "2024-05-02"
    },
    {
      "title": "Unity Training",
      "date": "2024-05-05"
    }
  ]
}
Response (error):
{
  "success": false,
  "error": "Invalid Notion API key."
}
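A developer-side handler matching the request/response shapes above might look like the following. This is a hypothetical sketch: a real deployment would sit behind an HTTP framework with authentication, while here the handler is a plain function over the parsed JSON body, and the event data is a local stand-in for a calendar API:

```python
import json

EVENTS = [  # stand-in data; a real handler would query a calendar API
    {"title": "Game Jam", "date": "2024-05-02"},
    {"title": "Unity Training", "date": "2024-05-05"},
]

def handle_calendar_get_events(body: dict) -> dict:
    """Validate parameters and return the success/error shape shown above."""
    params = body.get("parameters", {})
    start, end = params.get("start_date"), params.get("end_date")
    if not start or not end:
        return {"success": False,
                "error": "start_date and end_date are required."}
    # ISO dates compare correctly as strings, so a lexicographic filter works.
    matching = [e for e in EVENTS if start <= e["date"] <= end]
    return {"success": True, "result": matching}

request_body = json.loads("""{
  "user_id": "abc123",
  "parameters": {"start_date": "2024-05-01", "end_date": "2024-05-07"}
}""")
response = handle_calendar_get_events(request_body)
```

Returning a structured error (rather than raising) keeps the contract uniform, so Omi can relay failures like "Invalid Notion API key." into chat without special-casing.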
In Review
Feature Requests
10 months ago

Ibrahim Albayrak