Chapter 7: Introspection & Debugging System

Welcome to the final chapter of the airi tutorial series!

Let's look back at what we have built:

  1. Body: Agent Adapters (Minecraft, Discord).
  2. Mind: The Cognitive Brain (thinking and decision-making).
  3. Face: The "Stage" visual presentation layer (VRM/Live2D).
  4. Memory: The central data & identity server (database).
  5. Ears: Sensory audio processing (hearing).
  6. Hands: The native capabilities bridge (desktop control).

We have a fully functional digital life form. But there is one scary problem: The Black Box.

Sometimes, your AI will do something strange. It might stare at a wall in Minecraft for 10 minutes. It might reply to "Hello" with "I like cheese." Without a debugging system, you are blind. You don't know why it did that.

In this chapter, we will build the Introspection & Debugging System. Think of this as an MRI machine or a Flight Recorder. It lets us see the raw thoughts inside the AI's brain in real-time.

The Motivation: Why did you do that?

LLMs (Large Language Models) are non-deterministic. If you give them the same input twice, they might give different answers.

Use Case: Your Minecraft bot keeps jumping into lava. You read the movement code line by line and it is correct, so staring at the code gets you nowhere. With introspection, you open the dashboard and watch the thought stream as it happens. You might see something like:

  Thought: "The lava is only two blocks wide. I can easily clear that jump."

Now you know the bug isn't the code; it's the AI's confidence in its jumping ability!

Key Concepts

1. The Flight Recorder (Event Bus)

Every time the brain "thinks," receives a message, or changes its mood, it emits a Debug Event. These are small packets of data like: {"type": "thought", "message": "I am hungry"}.

2. The Dashboard (Visualizer)

This is a dedicated website (running locally) that connects to the AI. It looks like a sci-fi control panel. It shows the live log stream, the brain's current state and mood, and the queue of pending events, all updating in real time.

3. MCP (The Universal Remote)

MCP stands for Model Context Protocol. It is a standard way for AI tools to talk to each other. We include an MCP Server so you can use external tools (like an IDE or another AI) to "inspect" your running bot, pause its brain, or manually inject fake events to test how it reacts.

How to Use: The Debug Service

The heart of this system is the DebugService. It acts like a news station. The Brain reports news to it, and the Service broadcasts it to the Dashboard.

In your application code, you simply "report" what is happening.

import { DebugService } from './debug/debug-service'

// Get the singleton instance (the one global reporter)
const debug = DebugService.getInstance()

// Report a simple log
debug.log('INFO', 'I just saw a creeper!')

// Report a complex state change
debug.emitBrainState({
  status: 'PANIC',
  queueLength: 5,
  contextView: 'There is a green monster nearby.'
})

Explanation: You don't need to know how the dashboard works here. You just send data to the DebugService. It handles the complexity of sending that data over the network.
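Conceptually, the service can be quite small. Here is a minimal sketch of a DebugService that matches the API used above. The in-process listener list stands in for the real network transport, and the field names are illustrative assumptions, not the project's actual implementation:

```typescript
// Sketch of a DebugService singleton. The real class forwards events
// over the network; here we use an in-process listener list instead.
type DebugEvent =
  | { type: 'log', level: string, message: string }
  | { type: 'brain_state', payload: Record<string, unknown> }

class DebugService {
  private static instance: DebugService
  private listeners: Array<(e: DebugEvent) => void> = []

  // One global reporter for the whole process
  static getInstance(): DebugService {
    if (!DebugService.instance)
      DebugService.instance = new DebugService()
    return DebugService.instance
  }

  // The server (or a test) subscribes here to receive every event
  onEvent(fn: (e: DebugEvent) => void): void {
    this.listeners.push(fn)
  }

  log(level: string, message: string): void {
    this.emit({ type: 'log', level, message })
  }

  emitBrainState(payload: Record<string, unknown>): void {
    this.emit({ type: 'brain_state', payload })
  }

  private emit(event: DebugEvent): void {
    for (const fn of this.listeners)
      fn(event)
  }
}
```

The singleton pattern matters here: every module in the process reports to the same instance, so the dashboard sees one unified stream rather than fragments.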

Viewing the Dashboard

When you run your airi project, the Debug Server starts automatically (usually on port 3000). You simply open http://localhost:3000 in your web browser.

You will see the "Control Panel" built in services/minecraft/src/debug/web/index.html. It connects via WebSockets and updates live as your AI thinks.

Internal Implementation: The Data Flow

How does a "thought" inside a Node.js process appear on a website in Chrome?

The Pipeline

```mermaid
sequenceDiagram
    participant Brain as Cognitive Brain
    participant Service as Debug Service
    participant Server as Debug Server (WebSocket)
    participant Dashboard as Web Browser
    Brain->>Service: log("Thinking about food...")
    Service->>Server: broadcast(event)
    par Broadcast
        Server->>Server: Save to log file (disk)
        Server->>Dashboard: Send JSON via WebSocket
    end
    Dashboard->>Dashboard: Update UI (Add line to log)
```

Deep Dive 1: The Debug Server

Let's look at services/minecraft/src/debug/server.ts. This file spins up a WebSocket server to talk to the web dashboard.

// derived from DebugServer.broadcast
public broadcast(event: ServerEvent): void {
  // 1. Keep a history in memory (so new tabs see old logs)
  this.addToHistory(event)

  // 2. Prepare the JSON message
  const message = JSON.stringify({
    type: 'broadcast',
    data: event,
    timestamp: Date.now()
  })

  // 3. Send to every open browser tab
  // (guard against tabs that are mid-disconnect, where send() would throw)
  for (const client of this.clients.values()) {
    if (client.ws.readyState === WebSocket.OPEN)
      client.ws.send(message)
  }
}

Explanation: This loop ensures that if you have the dashboard open on your laptop and your phone, both get the update instantly.
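The history mentioned in step 1 is what lets a freshly opened tab see what happened before it connected. A natural way to implement it is a bounded buffer, so memory use stays flat no matter how long the bot runs. This is an assumed sketch, not the project's actual addToHistory:

```typescript
// Hypothetical bounded event history: keeps only the last `maxSize`
// events so a new dashboard tab can be brought up to date on connect
// without the buffer growing forever.
class EventHistory<T> {
  private events: T[] = []

  constructor(private maxSize: number = 500) {}

  add(event: T): void {
    this.events.push(event)
    // Drop the oldest entries once the cap is exceeded
    if (this.events.length > this.maxSize)
      this.events.splice(0, this.events.length - this.maxSize)
  }

  // Replayed to each new WebSocket client right after it connects
  all(): readonly T[] {
    return this.events
  }
}
```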

Deep Dive 2: The MCP Repl (The Probe)

Sometimes viewing logs isn't enough. You want to change things. We use the MCP Server (services/minecraft/src/debug/mcp-repl-server.ts) to let developers inject commands.

Imagine you want to test how the AI handles a rude user, but you don't want to actually be rude. You can "inject" a fake chat message.

// derived from McpReplServer constructor
this.mcpServer.tool('inject_chat',
  { 
    username: z.string(), 
    message: z.string() 
  },
  async ({ username, message }) => {
    // Fake a perception event!
    await this.brain.injectDebugEvent({
      type: 'perception',
      payload: {
        type: 'chat_message',
        description: `Chat from ${username}: "${message}"`
      }
    })

    return { content: [{ type: 'text', text: 'Injected!' }] }
  }
)

Explanation:

  1. We define a tool named inject_chat.
  2. When called, it creates a fake perception event.
  3. The Cognitive Brain receives this event and thinks it's real. It reacts to the fake message exactly as if it were real.

Deep Dive 3: The Frontend Dashboard

The dashboard (services/minecraft/src/debug/web/app.js) is a vanilla JavaScript application. It doesn't use heavy frameworks like React because we want it to be extremely fast and robust, even if the main AI is crashing.

It listens for the WebSocket messages we sent earlier.

// derived from app.js
class DebugClient {
  connect() {
    // Connect to the server
    this.ws = new WebSocket('ws://localhost:3000')

    // Listen for incoming thoughts
    this.ws.onmessage = (event) => {
      const message = JSON.parse(event.data)

      // The server wraps each event as { type: 'broadcast', data: ... },
      // so unwrap it before routing
      const inner = message.data

      // Update the correct panel based on the inner event type
      if (inner.type === 'log') {
        logsPanel.addLog(inner.payload)
      } else if (inner.type === 'brain_state') {
        brainPanel.update(inner.payload)
      }
    }
  }
}

Explanation: This is the receiver. It parses the JSON and routes the data to the correct panel (Logs, Brain State, or Queue).
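One robustness detail worth copying into any such client: when the AI process crashes and restarts, the dashboard should reconnect on its own rather than require a manual refresh. A common pattern is exponential backoff with a ceiling. This is a generic sketch (not necessarily what app.js does), with a minimal socket interface standing in for the browser's WebSocket:

```typescript
// Hypothetical reconnect helper: retries a dead server at 1s, 2s, 4s,
// ... capped at 30s, and resets the counter after a successful connect.
function backoffDelay(attempt: number, baseMs = 1000, maxMs = 30000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs)
}

// The minimal slice of the browser WebSocket API this helper relies on
interface SocketLike {
  onopen: (() => void) | null
  onclose: (() => void) | null
  onmessage: ((event: { data: string }) => void) | null
}

function connectWithRetry(
  makeSocket: () => SocketLike, // e.g. () => new WebSocket('ws://localhost:3000')
  onMessage: (event: { data: string }) => void,
  attempt = 0,
): void {
  const ws = makeSocket()
  ws.onmessage = onMessage
  ws.onopen = () => { attempt = 0 } // healthy again: reset the backoff
  ws.onclose = () => {
    // Schedule a fresh connection attempt after the backoff delay
    setTimeout(
      () => connectWithRetry(makeSocket, onMessage, attempt + 1),
      backoffDelay(attempt),
    )
  }
}
```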

Summary

The Introspection & Debugging System is the lens through which we view our AI's soul.

  1. It uses a Debug Service to collect events from the Brain.
  2. It uses a WebSocket Server to broadcast these events to a Web Dashboard.
  3. It uses an MCP Server to allow developers to inject fake events and test behaviors.

Without this system, developing a complex AI agent is like flying a plane blindfolded. With it, you can see every thought, every decision, and every error as it happens.


Conclusion to the Series

Congratulations! You have read through the architecture of airi.

We have covered:

  1. Bodies (Adapters)
  2. Minds (Cognitive Brain)
  3. Faces (Stage)
  4. Memories (Data Server)
  5. Senses (Hearing & Desktop Control)
  6. Introspection (Debugging)

You now have the conceptual blueprint to buildβ€”or contribute toβ€”a fully autonomous AI agent that lives on your computer, remembers you, and interacts with your digital world.

Happy Coding!


Generated by Code IQ