Welcome to the final chapter of the airi tutorial series!
Let's look back at what we have built: a fully functional digital life form. But there is one scary problem: the Black Box.
Sometimes, your AI will do something strange. It might stare at a wall in Minecraft for 10 minutes. It might reply to "Hello" with "I like cheese." Without a debugging system, you are blind. You don't know why it did that.
In this chapter, we will build the Introspection & Debugging System. Think of this as an MRI machine or a Flight Recorder. It lets us see the raw thoughts inside the AI's brain in real-time.
LLMs (Large Language Models) are non-deterministic. If you give them the same input twice, they might give different answers.
Use Case: Your Minecraft bot keeps jumping into lava. Instead of guessing, you open the dashboard and watch the thought that fires right before each jump, something like: "The lava is only two blocks wide. I can make it." Now you know the bug isn't in the code; it's the AI's confidence in its jumping ability!
Every time the brain "thinks," receives a message, or changes its mood, it emits a Debug Event. These are small packets of data like: {"type": "thought", "message": "I am hungry"}.
The Debug Dashboard is a dedicated website (running locally) that connects to the AI. It looks like a sci-fi control panel and shows live logs, the brain's current state, and the action queue.
MCP stands for Model Context Protocol. It is a standard way for AI tools to talk to each other. We include an MCP Server so you can use external tools (like an IDE or another AI) to "inspect" your running bot, pause its brain, or manually inject fake events to test how it reacts.
The heart of this system is the DebugService. It acts like a news station. The Brain reports news to it, and the Service broadcasts it to the Dashboard.
In your application code, you simply "report" what is happening.
import { DebugService } from './debug/debug-service'
// Get the singleton instance (the one global reporter)
const debug = DebugService.getInstance()
// Report a simple log
debug.log('INFO', 'I just saw a creeper!')
// Report a complex state change
debug.emitBrainState({
  status: 'PANIC',
  queueLength: 5,
  contextView: 'There is a green monster nearby.'
})
Explanation:
You don't need to know how the dashboard works here. You just send data to the DebugService. It handles the complexity of sending that data over the network.
When you run your airi project, the Debug Server starts automatically (usually on port 3000).
You simply open http://localhost:3000 in your web browser.
You will see the "Control Panel" built in services/minecraft/src/debug/web/index.html. It connects via WebSockets and updates live as your AI thinks.
How does a "thought" inside a Node.js process appear on a website in Chrome?
Let's look at services/minecraft/src/debug/server.ts. This file spins up a WebSocket server to talk to the web dashboard.
// derived from DebugServer.broadcast
public broadcast(event: ServerEvent): void {
  // 1. Keep a history in memory (so new tabs see old logs)
  this.addToHistory(event)

  // 2. Prepare the JSON message
  const message = JSON.stringify({
    type: 'broadcast',
    data: event,
    timestamp: Date.now()
  })

  // 3. Send to every open browser tab
  for (const client of this.clients.values()) {
    client.ws.send(message)
  }
}
Explanation: This loop ensures that if you have the dashboard open on your laptop and your phone, both get the update instantly.
Sometimes viewing logs isn't enough. You want to change things. We use the MCP Server (services/minecraft/src/debug/mcp-repl-server.ts) to let developers inject commands.
Imagine you want to test how the AI handles a rude user, but you don't want to actually be rude. You can "inject" a fake chat message.
// derived from McpReplServer constructor
this.mcpServer.tool('inject_chat',
  {
    username: z.string(),
    message: z.string()
  },
  async ({ username, message }) => {
    // Fake a perception event!
    await this.brain.injectDebugEvent({
      type: 'perception',
      payload: {
        type: 'chat_message',
        description: `Chat from ${username}: "${message}"`
      }
    })
    return { content: [{ type: 'text', text: 'Injected!' }] }
  }
)
Explanation: When an external tool calls inject_chat, the handler wraps the fake message in a perception event and hands it to the brain, exactly as if a real player had typed it in chat.
The dashboard (services/minecraft/src/debug/web/app.js) is a vanilla JavaScript application. It doesn't use heavy frameworks like React because we want it to be extremely fast and robust, even if the main AI is crashing.
It listens for the WebSocket messages we sent earlier.
// derived from app.js
class DebugClient {
  connect() {
    // Connect to the server
    this.ws = new WebSocket('ws://localhost:3000')

    // Listen for incoming thoughts
    this.ws.onmessage = (event) => {
      const message = JSON.parse(event.data)

      // Update the correct panel based on event type
      if (message.type === 'log') {
        logsPanel.addLog(message.payload)
      } else if (message.type === 'brain_state') {
        brainPanel.update(message.payload)
      }
    }
  }
}
Explanation: This is the receiver. It parses the JSON and routes the data to the correct panel (Logs, Brain State, or Queue).
The Introspection & Debugging System is the lens through which we view our AI's soul.
Without this system, developing a complex AI agent is like flying a plane blindfolded. With it, you can see every thought, every decision, and every error as it happens.
Congratulations! You have read through the architecture of airi.
We have covered:
You now have the conceptual blueprint to build, or contribute to, a fully autonomous AI agent that lives on your computer, remembers you, and interacts with your digital world.
Happy Coding!
Generated by Code IQ