Custom LLM provider support

Add an option to change the LLM API provider, mainly to be able to plug in an Ollama instance (or any other OpenAI-compatible API).

With this, we'd be able to run a fully local and private Omi workflow.
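Since Ollama exposes an OpenAI-compatible endpoint, provider support could largely come down to making the API base URL configurable. A minimal sketch of that idea follows; the provider table and the `build_chat_request` helper are assumptions for illustration, not Omi's actual configuration schema.

```python
import json

# Hypothetical provider registry: swapping providers is just a base-URL change.
# "http://localhost:11434/v1" is Ollama's default OpenAI-compatible endpoint.
DEFAULT_PROVIDERS = {
    "openai": "https://api.openai.com/v1",
    "ollama": "http://localhost:11434/v1",
}

def build_chat_request(provider, model, messages, base_urls=DEFAULT_PROVIDERS):
    """Return (url, body) for an OpenAI-style /chat/completions call."""
    url = base_urls[provider].rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode()
    return url, body

# Same request shape works against either provider; only the URL differs.
url, body = build_chat_request(
    "ollama", "llama3", [{"role": "user", "content": "Hello"}]
)
```

Because the request and response shapes match the OpenAI spec, the same client code would work unchanged against either backend.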

Status: In Review
Board: 💡 Feature Requests
Tags: High Priority
Date: 2 months ago
Author: An Anonymous User
