
πŸ€– Avatar Companion

Bring your AI into the physical world. Interact with your local Companion Agent as an embodied avatar across Unity and WebXR environments.

Overview

Avatar Companion gives your Companion AI a visual, spatial presence. Instead of interacting with your agent through a text chat interface, you engage with a 3D avatar that lives in a shared virtual space β€” on your laptop browser (WebXR), in a dedicated app, or on a mixed reality headset.

The avatar renders locally, queries your Hub’s inference stack for responses, and uses the Digital Memory Server for context. All rendering and inference run on your own hardware.

Key Features

  • WebXR support β€” runs in any WebXR-capable browser with no app install
  • Unity client β€” higher fidelity avatar with advanced animation and lip sync (Quest, Pico, Vision Pro)
  • Voice interaction β€” speak to your avatar; it responds with synthesized speech (local Piper TTS)
  • Memory-aware β€” avatar accesses Digital Memory context for personalized, contextual conversations
  • Persistent presence β€” avatar can persist in a dedicated virtual room you return to
  • Customizable appearance β€” choose from built-in avatar styles or import custom .vrm models
  • Headset platforms β€” Quest 3, Pico XR, Vision Pro, Zappar Zapbox

Supported Platforms

| Platform | Client | Status |
| --- | --- | --- |
| Desktop browser (Chrome, Edge) | WebXR | βœ… |
| Meta Quest 3 | Unity + WebXR | βœ… |
| Pico XR | Unity + WebXR | βœ… |
| Apple Vision Pro | Unity + WebXR | βœ… |
| Zappar Zapbox | WebXR | βœ… |
| iOS/Android (AR) | WebXR | βœ… |

Architecture

```
WebXR / Unity Client
β”œβ”€β”€ 3D Avatar Renderer
β”œβ”€β”€ WebRTC Audio
└── XR Input Handler
        β”‚ WebSocket (local LAN)
        β–Ό
Avatar Companion Hub Service
β”œβ”€β”€ Session manager
β”œβ”€β”€ Speech-to-text (Whisper)
β”œβ”€β”€ Agent router ──▢ Ollama / Companion Agent
β”œβ”€β”€ Text-to-speech (Piper)
└── Memory bridge ──▢ Digital Memory Server
```
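For illustration, the round trip in the diagram (client utterance in, synthesized reply out) can be sketched as a simple message pipeline. The message shapes and field names below (`type`, `session_id`, `text`, `audio`) are assumptions for illustration only, not the actual Hub protocol:

```python
import json

def handle_utterance(message: str) -> str:
    """Simulate the Hub pipeline: STT -> agent router -> TTS.

    All message fields here are hypothetical stand-ins, not the real
    Avatar Companion wire format.
    """
    msg = json.loads(message)
    assert msg["type"] == "user_utterance"

    # 1. Speech-to-text: Whisper would transcribe msg["audio"];
    #    here we fall back to a text field for the sketch.
    transcript = msg.get("text", "<transcribed audio>")

    # 2. Agent router: the transcript would be forwarded to Ollama /
    #    the Companion Agent, optionally enriched via the Memory bridge.
    reply_text = f"Echo: {transcript}"  # stand-in for the agent reply

    # 3. Text-to-speech: Piper would synthesize reply_text; we return
    #    a reply message with a placeholder audio field.
    return json.dumps({
        "type": "agent_reply",
        "session_id": msg["session_id"],
        "text": reply_text,
        "audio": None,  # Piper audio bytes would go here
    })

request = json.dumps({
    "type": "user_utterance",
    "session_id": "demo",
    "text": "hello avatar",
})
reply = json.loads(handle_utterance(request))
print(reply["text"])
```

In the real service these three stages run on the Hub, so the client only ever sends audio and receives synthesized audio plus lip-sync data back over the WebSocket.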

Setup

Install from Hub

Search for Avatar Companion in the Hub app store and install.

Open the WebXR client

Navigate to http://avatar.ci.localhost in a WebXR-capable browser. Click Enter to launch the experience. No headset required for desktop use.

Configure your avatar

In Settings β†’ Avatar, select your preferred avatar appearance. For custom .vrm avatars, upload via the avatar manager.

Configure voice

In Settings β†’ Voice:

  • Select Whisper model for speech recognition
  • Select Piper voice for TTS
  • Test the audio pipeline with Settings β†’ Test Voice

(Optional) Install Unity client for headsets

Download the Unity client app from:

  • Quest 3 / Pico: SideQuest or the platform app store
  • Vision Pro: TestFlight link (see Hub dashboard β†’ Avatar Companion β†’ Install)

Usage

Starting a Session

Open http://avatar.ci.localhost and click Enter Environment. You are placed in a default virtual room with your avatar present and ready to converse.

Voice Conversation

Click the microphone button (or say the wake phrase) to start talking. The avatar lip-syncs to its spoken response in real time.

Entering XR Mode

In a WebXR browser or on a headset, click Enter XR to switch to immersive mode. On a headset, use hand tracking or controllers to interact.

Switching Environments

Navigate to Environments to choose a different virtual room, or import a custom environment as a .glb file.

Troubleshooting

**Avatar not loading**
Check that the Avatar Companion service is running in Hub. Reload the page and wait for the initial asset load (the first launch downloads avatar assets).

**Voice input not recognized**
Ensure your browser has microphone permission for avatar.ci.localhost, and check that Whisper is running in Hub.

**Avatar lips not syncing**
TTS and lip sync require a compatible Piper voice with phoneme output. Verify that the selected voice supports phoneme data in Settings β†’ Voice.

**Headset connection refused**
Ensure both the headset and your Hub are on the same Wi-Fi network. Open http://avatar.ci.localhost on the headset’s browser to verify connectivity before launching the Unity app.
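The reachability check can also be scripted from any machine on the same network. This sketch only tests whether a TCP connection to the Hub host succeeds; the default port is an assumption, so substitute your Hub's LAN IP or port if the default does not resolve:

```python
import socket

def hub_reachable(host: str, port: int = 80, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failure, connection refused, and timeout.
        return False

# "avatar.ci.localhost" is the Hub address used throughout this guide;
# headsets that cannot resolve it may need the Hub's LAN IP instead.
if hub_reachable("avatar.ci.localhost"):
    print("Hub reachable")
else:
    print("Hub not reachable: check the Wi-Fi network and service status")
```

A `False` result from the headset but `True` from your desktop usually points at the two devices being on different networks or the headset failing to resolve the hostname.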
