
πŸ›°οΈ Even Realities

Heads-up access to your local AI. Connect Even Realities and OMI Glasses to your Companion Hub for ambient visual logging and always-on agent conversations β€” without pulling out a screen.

Overview

The CI-Even-Realities integration bridges Even Realities smart glasses (and compatible OMI Glasses) with your local Companion Hub. The glasses display a minimal HUD overlay with agent responses, notifications, and contextual information pulled directly from your Digital Memory Server, while feeding visual and audio context back into your memory layer.

Unlike cloud-connected AR experiences, all inference happens on your local hardware. The glasses act purely as an input/output peripheral; your Hub handles all AI processing.

Key Features

  • HUD overlays β€” display agent responses, reminders, and contextual info in your field of view
  • Voice-activated agent β€” trigger your Companion Agent with a wake word and receive spoken + visual responses
  • Ambient capture β€” optional photo and audio logging to Digital Memory
  • Notification mirror β€” surface Hub app alerts (Home Assistant, Uptime Kuma) as HUD notifications
  • Agent sidebar mode β€” persistent agent panel in the HUD for ongoing task assistance
  • OMI Glasses compatibility β€” works with both Even Realities G1 and OMI Glasses hardware

Supported Hardware

Device                   Connection      Status
Even Realities G1        Bluetooth LE    βœ… Supported
OMI Glasses              Bluetooth LE    βœ… Supported
Even Realities G1 Pro    USB-C + BT      πŸ”„ In progress

Use Cases

  • Ask your local AI a question hands-free and see the answer in your glasses
  • Get walking or cooking directions from your agent without looking at a phone
  • Receive Hub alerts (server events, door sensors via Home Assistant) as HUD notifications
  • Capture ambient visual context to enrich your Digital Memory timeline
  • Use in combination with Omi Pendant for richer conversational memory

Architecture

Even Realities / OMI Glasses
        β”‚  Bluetooth LE
        β–Ό
CI-Even-Realities Bridge (Hub app)
        β”‚
        β”œβ”€β”€β–Ά Audio input ──▢ Local Whisper ──▢ Companion Agent
        β”‚                                           β”‚
        │◀──────────── Agent response (text/TTS) β—€β”€β”€β”˜
        β”‚
        β”œβ”€β”€β–Ά Photo capture ──▢ Digital Memory
        β”‚
        └──▢ Display command (HUD overlay) ──▢ Glasses display
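The bridge's routing logic can be sketched as a simple event dispatcher. This is illustrative only β€” the event classes and callback names below are assumptions standing in for the Whisper, Companion Agent, Digital Memory, and HUD interfaces in the diagram, not the app's actual API:

```python
from dataclasses import dataclass

# Hypothetical event types the bridge might receive over Bluetooth LE.
@dataclass
class AudioChunk:
    pcm: bytes

@dataclass
class PhotoCaptured:
    jpeg: bytes

def route_event(event, transcribe, ask_agent, ingest_photo, show_hud):
    """Dispatch one glasses event to the right Hub component.

    Audio flows through transcription to the agent and back to the HUD;
    photos go straight into Digital Memory, matching the diagram above.
    """
    if isinstance(event, AudioChunk):
        text = transcribe(event.pcm)      # Local Whisper
        reply = ask_agent(text)           # Companion Agent
        show_hud(reply)                   # Display command back to glasses
        return reply
    if isinstance(event, PhotoCaptured):
        ingest_photo(event.jpeg)          # Digital Memory ingest
        return None
    raise ValueError(f"unknown event: {event!r}")
```

The point of the single dispatcher is that the glasses stay a dumb peripheral: every branch ends at a Hub-side component, never at a cloud service.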

Setup

Install from Hub app store

Search for Even Realities in Companion Hub and install.

Pair glasses

  1. Open the Even Realities app on your phone (required for initial firmware setup)
  2. Once glasses are configured and on your network, open http://even-realities.ci.localhost
  3. Navigate to Devices β†’ Pair and follow the Bluetooth pairing steps

Configure HUD display

Under Display Settings, choose:

  • Font size and placement for agent responses
  • Notification types to mirror to HUD
  • Brightness and contrast preferences
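These settings presumably serialize to a small config blob on the Hub. A hypothetical shape, with a sanity check β€” every key name here is an assumption, not the app's actual schema:

```python
# Illustrative HUD display config; key names are assumptions.
hud_config = {
    "font_size": "medium",        # small | medium | large
    "placement": "lower-third",   # where agent responses render
    "mirrored_notifications": ["home-assistant", "uptime-kuma"],
    "brightness": 70,             # percent
    "high_contrast": False,       # enable in bright sunlight
}

def validate_hud_config(cfg: dict) -> list:
    """Return a list of problems; an empty list means the config looks sane."""
    problems = []
    if cfg.get("font_size") not in {"small", "medium", "large"}:
        problems.append("font_size must be small, medium, or large")
    if not 0 <= cfg.get("brightness", -1) <= 100:
        problems.append("brightness must be between 0 and 100")
    return problems
```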

Set up voice agent

In Agent Settings, configure:

  • Wake word (e.g., β€œHey Companion”)
  • Response mode: Text only, Voice + Text, or Voice only
  • LLM endpoint (defaults to your Hub’s Ollama or Open WebUI instance)
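If the endpoint is an Ollama instance, the bridge's call amounts to one HTTP request. A minimal sketch against Ollama's standard `/api/generate` endpoint β€” the endpoint path and payload fields are Ollama's documented API, but the model name and URL are assumptions about your setup:

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "llama3") -> dict:
    # Non-streaming request, so the full reply arrives as one JSON object
    # that can be rendered as a single HUD overlay.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_agent(prompt: str,
              endpoint: str = "http://localhost:11434/api/generate") -> str:
    """Send a prompt to a local Ollama instance and return its reply text."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# ask_agent("What's on my calendar today?")  # requires a running Ollama instance
```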

Usage

Agent Conversation

Say your wake word followed by a question. The response appears in your HUD within seconds. Responses are also logged to your Digital Memory session history.
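At the text level, wake-word handling reduces to matching the transcript prefix and forwarding the remainder as the query. A simplified sketch β€” real wake-word detection runs on the audio stream before transcription, so this is illustration only:

```python
def extract_query(transcript: str, wake_word: str = "hey companion"):
    """Return the query following the wake word, or None if not triggered."""
    normalized = transcript.lower().strip()
    if not normalized.startswith(wake_word):
        return None
    # Strip the wake word plus any separating punctuation or spaces.
    return normalized[len(wake_word):].lstrip(" ,.!?") or None
```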

HUD Notifications

Notifications from configured Hub apps appear as brief overlays in your lower visual field. Tap the glasses frame to dismiss.

Photo Capture

Single-tap the frame to capture a photo, which is immediately ingested into Digital Memory.
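Before a capture lands on your timeline, the bridge presumably attaches timestamp and deduplication metadata. A hedged sketch of what such an ingest record might look like β€” the field names and default source tag are assumptions:

```python
import hashlib
from datetime import datetime, timezone

def photo_record(jpeg: bytes, source: str = "even-realities-g1") -> dict:
    """Build an ingest record for a captured photo.

    The SHA-256 digest would let Digital Memory deduplicate identical
    frames; the UTC timestamp anchors the photo on the memory timeline.
    """
    return {
        "sha256": hashlib.sha256(jpeg).hexdigest(),
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "source": source,
        "bytes": len(jpeg),
    }
```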

Troubleshooting

HUD text not visible
Adjust brightness in Display Settings. In bright sunlight, enable High Contrast Mode.

Wake word not detected
Ensure microphone access is granted to the CI-Even-Realities container. Test the microphone in Agent Settings β†’ Microphone Test.

Glasses disconnecting frequently
Keep your Hub within 10 meters line-of-sight. Bluetooth interference from other devices can cause drops. Try switching the glasses to a less congested Bluetooth channel in the advanced settings.
