I'm running Omi with a custom webhook integration (OpenClaw AI assistant framework on a Mac Mini). Transcripts stream through a local proxy that routes them into a persistent AI agent with memory, home automation, and ambient awareness.
The problem: there's no retry mechanism or delivery acknowledgment for webhooks. If my gateway is briefly down (reboot, deploy, crash), transcripts are silently lost forever. There's no way to know something was missed.
What I'd like:
- Retry with backoff when the webhook endpoint returns a non-2xx status or is unreachable
- A short buffer (even 5-10 min) of unacknowledged payloads
- Delivery receipt / acknowledgment protocol so the sender knows the payload was received
- Optionally, a dead-letter queue or "missed webhooks" endpoint to retrieve what was lost
This is the biggest reliability gap for anyone building serious integrations on top of Omi. The device is great at capturing — but if the pipe leaks, the data is gone.
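To make the request concrete, here's a minimal sender-side sketch of the retry-with-backoff, short buffer, and acknowledgment behavior described above. This is a hypothetical illustration, not Omi's actual webhook implementation: the `WebhookSender` class, its field names, and the "2xx response = acknowledgment" convention are all assumptions.

```python
import time
import urllib.request
import urllib.error
from collections import deque


def backoff_delays(base=1.0, factor=2.0, retries=5):
    """Exponential backoff schedule: 1s, 2s, 4s, 8s, 16s by default."""
    return [base * factor ** i for i in range(retries)]


class WebhookSender:
    """Retries deliveries with backoff and buffers unacknowledged payloads.

    Hypothetical sketch only: the endpoint URL, buffer TTL, and class
    shape are assumptions, not part of Omi's actual API.
    """

    def __init__(self, url, buffer_ttl=600):  # keep undelivered payloads ~10 min
        self.url = url
        self.buffer_ttl = buffer_ttl
        self.pending = deque()  # (timestamp, payload) pairs awaiting delivery

    def _post_once(self, payload: bytes) -> bool:
        """One delivery attempt; a 2xx response counts as the acknowledgment."""
        try:
            req = urllib.request.Request(
                self.url, data=payload,
                headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req, timeout=5) as resp:
                return 200 <= resp.status < 300
        except (urllib.error.URLError, OSError):
            # Endpoint down or unreachable: treat as not acknowledged.
            return False

    def deliver(self, payload: bytes) -> bool:
        """Try immediately, then retry on the backoff schedule."""
        for delay in [0.0] + backoff_delays():
            time.sleep(delay)
            if self._post_once(payload):
                return True
        # All retries failed: park the payload in a dead-letter buffer
        # so a "missed webhooks" endpoint could serve it later.
        self.pending.append((time.time(), payload))
        return False

    def expire_old(self):
        """Drop buffered payloads older than the TTL (oldest first)."""
        cutoff = time.time() - self.buffer_ttl
        while self.pending and self.pending[0][0] < cutoff:
            self.pending.popleft()
```

Even something this simple (retry ~30 seconds total, then dead-letter) would close most of the gap: brief gateway reboots and deploys would be absorbed by the retries, and anything longer would at least be recoverable instead of silently dropped.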
In Review
Feature Requests
High Priority
4 days ago

samshields-oc