Welcome to the final chapter of the deer-flow tutorial!
In Chapter 6: Long-Term Memory Updater, we gave our AI a permanent memory so it remembers you. Before that, we built the Brain (Lead Agent), the Hands (Skills), the Workers (Sub-Agents), and the Safety Lab (Sandbox).
We have built a powerful engine. But currently, all these parts are hidden deep inside Python scripts and configuration files.
If the Frontend (built in Chapter 1) wants to know: "Which AI models are available?" or "What skills can I use?", it cannot read your backend's config.yaml file directly. That would be insecure and messy.
In this chapter, we build the Gateway API. This is the "Receptionist" of our system—a clean, unified interface that connects the visual Frontend to the complex Backend.
Imagine a large hotel.
If a guest wants a towel, they don't wander into the basement to find the laundry room. They call the Receptionist. The Receptionist knows where the laundry room is, gets the towel, and hands it to the guest.
In deer-flow, the Gateway plays the Receptionist's role: it aggregates everything (Models, Memory, Skills, and Files) into a simple set of web commands.
Let's imagine the user opens the Settings menu in the Frontend. They want to see which AI models are available and which skills they can use. The Frontend makes a request to the Gateway to get this data.
We build the Gateway using FastAPI, a modern Python framework. The core concept here is the Router.
Think of Routers as different "Department Desks" at the reception.
Instead of one giant file with 100 functions, we split them into organized folders.
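One possible layout looks like this. The router file names below are inferred from the routers registered in `app.py`, so treat this as a sketch of the idea rather than the exact repository structure:

```
backend/src/gateway/
├── app.py         # entry point: create_app()
├── models.py      # the Models "department desk"
├── memory.py      # the Memory desk
├── skills.py      # the Skills desk
└── artifacts.py   # the Artifacts (files) desk
```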
Let's look at backend/src/gateway/app.py. This is the entry point. It sets up the server and connects the departments.
We create the application and give it a title.
```python
# src/gateway/app.py
def create_app() -> FastAPI:
    app = FastAPI(
        title="DeerFlow API Gateway",
        description="API Gateway for DeerFlow...",
        version="0.1.0",
    )
    # ... code continues ...
    return app
```
Explanation: This initializes the web server. It creates the "Building" where our receptionist works.
This is the most important part. We tell the application which specific APIs we want to expose to the world.
```python
# Inside the create_app() function

# 1. Plug in the Models Department
app.include_router(models.router)

# 2. Plug in the Memory Department
app.include_router(memory.router)

# 3. Plug in the Skills Department
app.include_router(skills.router)

# 4. Plug in the Artifacts (Files) Department
app.include_router(artifacts.router)
```
Explanation: app.include_router is like opening a service window. Now, if someone visits /api/models, the models.router handles it. If they visit /api/memory, the memory.router handles it.
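To make the "service window" idea concrete, here is a minimal, self-contained sketch of how routing tables merge when routers are plugged in. The `Router` and `App` classes below are toy stand-ins for the concept, not FastAPI's real internals, and the model names are placeholders:

```python
# Toy illustration of router merging; not FastAPI's actual implementation.
class Router:
    def __init__(self, prefix: str):
        self.prefix = prefix
        self.routes = {}  # maps full path -> handler function

    def get(self, path: str, handler):
        # Register a handler under this router's prefix.
        self.routes[self.prefix + path] = handler


class App:
    def __init__(self):
        self.routes = {}

    def include_router(self, router: Router):
        # "Open the service window": merge the router's paths into the app.
        self.routes.update(router.routes)


# One "department desk" for models (placeholder model names).
models = Router("/api/models")
models.get("", lambda: ["model-a", "model-b"])

app = App()
app.include_router(models)

print(app.routes["/api/models"]())  # → ['model-a', 'model-b']
```

The real FastAPI version works the same way at a high level: each router owns a group of paths, and `include_router` merges them into the application's routing table.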
How do we know if the server is alive? We add a simple "heartbeat" endpoint.
```python
@app.get("/health", tags=["health"])
async def health_check() -> dict:
    return {
        "status": "healthy",
        "service": "deer-flow-gateway",
    }
```
Explanation: When a monitoring system (or the Frontend) pings /health, it gets a JSON response saying "I'm OK!"
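Because the handler returns a plain dict (which FastAPI serializes to JSON for us), its logic can be exercised in isolation with nothing but the standard library:

```python
import asyncio

# Standalone copy of the /health handler's body. In the real app,
# FastAPI converts the returned dict to a JSON response automatically.
async def health_check() -> dict:
    return {
        "status": "healthy",
        "service": "deer-flow-gateway",
    }

result = asyncio.run(health_check())
print(result["status"])  # → healthy
```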
Let's trace what happens when the Frontend asks for the list of available AI models.
The Gateway doesn't "guess" the configuration. It uses the AppConfig system defined in backend/src/config/app_config.py.
This system is the single source of truth. It reads config.yaml and environment variables (like API keys) and provides them to the Gateway securely.
The Gateway uses a helper function to load settings safely.
```python
# src/config/app_config.py (Simplified)
def get_app_config() -> AppConfig:
    global _app_config
    # Singleton pattern: load once, reuse everywhere
    if _app_config is None:
        _app_config = AppConfig.from_file()
    return _app_config
```
Explanation: This ensures we don't re-read the file from the hard drive 100 times a second. We load it once into memory and serve it fast.
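Here is a runnable sketch of that singleton pattern in isolation. The `AppConfig` body and `from_file` behavior are placeholders standing in for the real config loading, not deer-flow's actual implementation:

```python
class AppConfig:
    """Toy stand-in for the real AppConfig class."""

    @classmethod
    def from_file(cls) -> "AppConfig":
        # The expensive step we want to run only once.
        print("reading config from disk...")
        return cls()


_app_config = None


def get_app_config() -> AppConfig:
    # Singleton: the disk read happens only on the first call;
    # every later call reuses the cached instance.
    global _app_config
    if _app_config is None:
        _app_config = AppConfig.from_file()
    return _app_config


a = get_app_config()  # triggers the load
b = get_app_config()  # served from the cache
print(a is b)  # → True: every caller shares one instance
```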
One specific thing the Gateway handles is the MCP (Model Context Protocol) configuration. As seen in backend/src/mcp/client.py, this allows us to connect external tools (like a Google Drive connector or a Slack connector).
```python
# src/mcp/client.py (Simplified)
def build_servers_config(extensions_config):
    # Get the list of enabled servers from the config
    enabled_servers = extensions_config.get_enabled_mcp_servers()

    # Format them for the client
    results = {}
    for name, config in enabled_servers.items():
        results[name] = build_server_params(name, config)
    return results
```
Explanation: The Gateway converts the raw MCP settings into a format that the LangGraph agents can actually use to connect to external tools.
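As a rough illustration of that conversion step, here is a toy `build_server_params` applied to one fake config entry. The field names (`command`, `args`) and the connector name are assumptions made for this example, not the real MCP configuration schema:

```python
# Hypothetical config shape, for illustration only.
def build_server_params(name: str, config: dict) -> dict:
    # Normalize a raw config entry into a launchable-server description.
    return {
        "command": config["command"],
        "args": config.get("args", []),
    }


enabled_servers = {
    "google-drive": {"command": "npx", "args": ["some-gdrive-connector"]},
}

results = {
    name: build_server_params(name, cfg)
    for name, cfg in enabled_servers.items()
}
print(results["google-drive"]["command"])  # → npx
```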
With the Gateway in place, our architecture is complete.
Congratulations! You have toured the entire deer-flow system.
We moved from a simple "chat bubble" interface to a fully fledged AI Operating System. By breaking the system down into seven distinct chapters, we ensured that every part has a single responsibility.
You are now ready to start building your own agents, adding new skills, or customizing the frontend to your liking.
Happy Coding!
Generated by Code IQ