Add the option to change the LLM API provider, mainly to be able to plug in an Ollama instance (or any other OpenAI-compatible API).
With this, we'd be able to run a fully local and private Omi workflow.
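As a sketch of what this could enable: Ollama already serves an OpenAI-compatible chat completions endpoint on port 11434 by default, so a provider-agnostic Omi would only need a configurable base URL. The snippet below builds such a request with the Python standard library; the endpoint constant, helper name, and model are illustrative assumptions, not part of Omi today.

```python
import json
from urllib import request

# Hypothetical endpoint: Ollama's OpenAI-compatible API, served locally
# at :11434 by default. Any model pulled via `ollama pull` would work.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request aimed at a local endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("llama3.1", "Summarize my last conversation.")
# Actually sending it requires a running Ollama instance:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape is the standard OpenAI one, the same code would talk to any OpenAI-compatible backend by swapping the URL, which is exactly the flexibility this feature asks for.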
In Review
Feature Requests
High Priority
2 months ago

An Anonymous User