In Chapter 1: Agent Adapters, we gave our AI a way to interact with the world (via Telegram or Minecraft). In Chapter 2: The Cognitive Brain, we gave it the ability to think and make decisions.
Now we have a thinking entity that can send text messages, but it is invisible. It has no face.
In this chapter, we will build The Stage. This is the visual interface where your character actually "lives." Whether you are looking at your AI on a website, on your phone, or as a desktop pet, they need a place to stand, move, and smile.
Imagine you want to launch your AI character on three different devices:

- a website in the browser,
- a desktop pet on your computer,
- an app on your phone.
The Problem: Rendering 3D models (VRM) or complex 2D animations (Live2D) is hard. If you write the code to load a 3D model for the Website, you don't want to rewrite it entirely for the Desktop app.
The Solution: We create a shared "Stage." Think of it like a traveling theater troupe. The Stage includes the lighting, the actors, the costumes, and the scripts. We just pack this Stage into different boxes (Web browser, Electron app, Mobile app).
To understand the Stage, think of a theater production.
The outermost layer is the container: the specific app wrapper.

- `apps/stage-web`: the web browser wrapper.
- `apps/stage-tamagotchi`: the desktop window wrapper.
- `apps/stage-pocket`: the mobile phone wrapper.

These containers handle device-specific concerns (such as checking battery level on a phone or window transparency on the desktop), but they all display the same Stage.
Beneath the container sits the Stage itself: the core library (`packages/stage-ui`). It contains the common UI elements: the chat bubbles, the settings menu, and the canvas where the character stands.

On the Stage stands the actor: the character itself. airi supports two types of actors:

- VRM: full 3D models, rendered with Three.js.
- Live2D: layered 2D models, common in anime-style games.
The magic happens because all the specific apps import the same core components.
Let's look at apps/stage-web/src/App.vue. This is the entry point for the website version.
```vue
<script setup lang="ts">
import { RouterView } from 'vue-router'

// We import the shared transition logic
import { StageTransitionGroup } from '@proj-airi/ui-transitions'
</script>

<template>
  <!-- This wrapper handles page transitions and themes -->
  <StageTransitionGroup :colors="colors">
    <!-- The "RouterView" loads the actual Stage scene -->
    <RouterView />
  </StageTransitionGroup>
</template>
```
Explanation:
This code is incredibly simple because all the hard work is hidden. The StageTransitionGroup applies the visual theme (colors, dark mode), and RouterView loads the actual character scene from the shared library.
If you looked at apps/stage-pocket/src/App.vue, you would see almost the exact same code! This is the power of the Stage abstraction.
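To see why this sharing pays off, here is a hedged, framework-agnostic sketch. The names `renderStage`, `renderWebShell`, and `renderPocketShell` are illustrative, not from the codebase: the point is that both wrappers delegate to one shared rendering function, so the scene logic lives in a single place.

```typescript
// Hypothetical shared core, standing in for packages/stage-ui.
// Each "shell" only adds its device-specific wrapper around the same scene.
function renderStage(sceneName: string): string {
  // In the real project this would mount the Vue scene; here we just
  // return a string so the sharing is easy to see.
  return `[stage: ${sceneName}]`
}

// Web wrapper: adds browser-specific chrome around the shared stage.
function renderWebShell(): string {
  return `<browser>${renderStage('main')}</browser>`
}

// Mobile wrapper: a different shell, but the identical stage underneath.
function renderPocketShell(): string {
  return `<phone>${renderStage('main')}</phone>`
}
```

Fixing a bug in the shared `renderStage` (or, in the real project, in `packages/stage-ui`) fixes it for every wrapper at once.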
What happens when the Stage loads? How does a file on your hard drive become a breathing character?
The heavy lifting is done in packages/stage-ui-three/src/components/Model/VRMModel.vue.
This component wraps the Three.js library to make loading 3D models easy.
When the component mounts, it fetches the model file.
```ts
// derived from VRMModel.vue
async function loadModel() {
  // 1. Use a helper to load the VRM file into the 3D scene
  const _vrmInfo = await loadVrm(modelSrc.value, {
    scene: scene.value, // The 3D world
    lookAt: true, // Enable eye tracking
  })

  // 2. Save the loaded model to our variable
  vrm.value = _vrmInfo._vrm

  // 3. Tell the rest of the app "I am ready!"
  emit('loaded', modelSrc.value)
}
```
Explanation:
We don't deal with raw vertices or textures here. We call loadVrm, which handles the parsing. Once loaded, we emit a loaded event so the loading screen can disappear.
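The load-then-announce pattern can be sketched without Three.js at all. The `ModelHost` class and its mock loader below are invented for illustration; they mirror the flow of the real component: load the asset, store it, then notify listeners.

```typescript
type LoadedHandler = (src: string) => void

// Hypothetical stand-in for the component: it "loads" a model, stores the
// result, and fires a 'loaded' event, mirroring the
// loadVrm -> assign -> emit sequence in VRMModel.vue.
class ModelHost {
  model: { src: string } | null = null
  private handlers: LoadedHandler[] = []

  on(_event: 'loaded', fn: LoadedHandler) {
    this.handlers.push(fn)
  }

  async loadModel(src: string) {
    // A real implementation would await a network fetch + VRM parse here.
    this.model = { src }

    // Tell the rest of the app "I am ready!"
    for (const fn of this.handlers) fn(src)
  }
}
```

The event is what lets the loading spinner (or any other listener) react without the loader knowing who is watching.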
A static model looks like a statue. We need to apply an "Idle" animation (breathing, slight swaying) so it feels alive.
```ts
// derived from VRMModel.vue
// Load the animation file
const animation = await loadVRMAnimation(idleAnimation.value)

// Create a mixer (this blends animations together)
vrmAnimationMixer.value = new AnimationMixer(_vrm.scene)

// Convert the loaded animation into a clip for this specific model
const clip = createVRMAnimationClip(animation, _vrm)

// Play the clip!
vrmAnimationMixer.value.clipAction(clip).play()
```
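One detail worth noting: a mixer only advances when you feed it elapsed time each frame, so somewhere in the render loop there must be an update call. Below is a hedged sketch with a stub that has the same `update(delta)` shape as Three.js's `AnimationMixer`; the real code would call `update(delta)` on the actual mixer inside its frame callback.

```typescript
// Stub with the same update(delta) shape as Three.js's AnimationMixer.
class StubMixer {
  time = 0
  update(delta: number) {
    this.time += delta // advance all playing clips by `delta` seconds
  }
}

// Simplified render loop: advance the mixer by each frame's elapsed time.
function runFrames(mixer: StubMixer, deltas: number[]) {
  for (const delta of deltas)
    mixer.update(delta) // real code: vrmAnimationMixer.value?.update(delta)
}
```

If the update call is missing, the model loads fine but stands frozen on the first frame of its idle pose.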
The defining feature of airi is that the character reacts to you. In the code, we "watch" for changes in the environment (like mouse position) and update the model's eyes.
```ts
// derived from VRMModel.vue
// Watch the "trackingMode" setting
watch(trackingMode, (newMode) => {
  if (newMode === 'mouse') {
    // If tracking the mouse, update the target whenever it moves
    watch([mouseX, mouseY], ([newX, newY]) => {
      // Calculate where the mouse is in the 3D world
      const target = lookAtMouse(newX, newY, camera)

      // Tell the model to look there
      emit('lookAtTarget', target)
    })
  }
})
```
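What might a helper like `lookAtMouse` do with those pixel coordinates? A typical first step (this function is illustrative; the real implementation may differ) is converting them to normalized device coordinates, where the screen spans -1 to 1 on both axes and Y points up, which is the space Three.js uses when unprojecting a screen point into the 3D world.

```typescript
// Convert a mouse position in CSS pixels to normalized device
// coordinates (NDC). Hypothetical helper for illustration.
function mouseToNdc(x: number, y: number, width: number, height: number) {
  return {
    x: (x / width) * 2 - 1, // left edge -> -1, right edge -> +1
    y: -((y / height) * 2 - 1), // top edge -> +1, bottom edge -> -1
  }
}
```

From there, the NDC point can be unprojected along the camera's view to produce a world-space target for the model's gaze.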
Explanation: This is Vue.js reactivity in action. We don't poll in a loop; we simply wait for `trackingMode` to change, then start following the mouse whenever it moves.

Not everyone wants a 3D character. airi also supports Live2D (a format common in anime-style games). The architecture is identical, but the engine changes.
In packages/stage-ui-live2d/src/components/scenes/Live2D.vue, we see the same pattern:
```vue
<template>
  <!-- The Canvas for 2D drawing -->
  <Live2DCanvas :width="width" :height="height">
    <!-- The Model Component:
         "focus-at" controls where the model looks,
         "mouth-open-size" drives lip syncing -->
    <Live2DModel
      :model-src="modelSrc"
      :focus-at="focusAt"
      :mouth-open-size="mouthOpenSize"
    />
  </Live2DCanvas>
</template>
```
Because the Stage abstracts the differences, the Cognitive Brain doesn't care if the body is 2D or 3D. It just sends a command like "Smile," and the Stage figures out how to render that smile on the specific model.
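That decoupling can be sketched as a small interface. The names `Actor`, `VrmActor`, `Live2dActor`, and `applyExpression` below are hypothetical, chosen to illustrate the idea rather than mirror the real API: the brain speaks in expressions, and each body translates them into its own engine's terms.

```typescript
// The contract the "brain" talks to: expressions, not render calls.
interface Actor {
  applyExpression(name: string): string
}

// 3D body: would drive VRM blend shapes via Three.js.
class VrmActor implements Actor {
  applyExpression(name: string) {
    return `vrm: set blend shape "${name}"`
  }
}

// 2D body: would drive Live2D motion parameters instead.
class Live2dActor implements Actor {
  applyExpression(name: string) {
    return `live2d: play motion "${name}"`
  }
}

// The brain doesn't know (or care) which body it has.
function smile(actor: Actor) {
  return actor.applyExpression('smile')
}
```

Swapping the actor swaps the rendering engine without touching the brain's code, which is exactly the property the Stage abstraction is after.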
The Stage is the visual presentation layer of the project.
Now that our character has a Brain, a Body, and a Face, we need to make sure it remembers who you are.
Next Chapter: Central Data & Identity Server
Generated by Code IQ