
Chapter 6: Usage Analytics Dashboard

Welcome to the final chapter of the cc-switch tutorial!

In the previous chapter, Unified MCP Management, we learned how to manage external tools for our AI. We now have a fully functional proxy that routes traffic, handles failures, adapts protocols, and manages tools.

But there is one scary thing about using AI APIs: The Bill.

Since you are paying per "token" (a chunk of text, roughly three-quarters of a word), running a complex coding task might cost $0.01 or $10.00. Without visibility, you are flying blind.

In this chapter, we will build the Usage Analytics Dashboard. Think of this as the "Smart Electricity Meter" for your AI. Instead of waiting for a monthly invoice, you can see exactly how much power you are using in real-time.

The Problem: "The Black Box" Bill

When you use Claude or OpenAI directly, you often don't see the cost until you log into their website and check the billing section. How much did today's session cost? Which model is burning the most money? Is my usage trending up?

Without cc-switch, these questions are hard to answer. With our dashboard, the answers are one click away.

Key Concepts

To build this, we need three distinct layers:

1. The Logger (The Clerk)

Every time a request finishes, someone needs to write down what happened: which model, how many tokens in and out, and when.

2. The Calculator (Pricing Engine)

Raw tokens mean nothing without a price tag. The system needs to know that gpt-4o costs more than gpt-3.5-turbo. We multiply tokens * price_per_token to get the dollar amount.
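That multiplication can be sketched in a few lines. This is an illustrative toy, not the actual cc-switch pricing engine: the `PRICE_PER_MTOK` table, its prices, and `estimateCost` are assumptions for the example. Prompt and completion tokens are priced separately because providers typically charge different rates for them.

```typescript
// Hypothetical pricing table: USD per 1 million tokens, split into
// prompt (input) and completion (output) rates.
const PRICE_PER_MTOK: Record<string, { input: number; output: number }> = {
  "gpt-4o": { input: 2.5, output: 10.0 },
  "gpt-3.5-turbo": { input: 0.5, output: 1.5 },
};

// Convert raw token counts into a dollar amount for one request.
function estimateCost(
  model: string,
  promptTokens: number,
  completionTokens: number
): number {
  const price = PRICE_PER_MTOK[model];
  if (!price) return 0; // unknown model: still log tokens, but no cost estimate

  return (
    (promptTokens / 1_000_000) * price.input +
    (completionTokens / 1_000_000) * price.output
  );
}

// Example: 1,000 prompt + 500 completion tokens on gpt-4o
// (1000 / 1e6) * 2.5 + (500 / 1e6) * 10.0 = 0.0025 + 0.005 = 0.0075
```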

3. The Visualizer (The Dashboard)

Rows of numbers are boring. We need to aggregate them (sum them up) and draw charts so we can spot trends instantly.


Usage: The Dashboard UI

From the user's perspective, this is just a beautiful screen in the application.

Let's look at the UsageDashboard component. It acts as a container that asks the backend for data and passes it to charts.

// src/components/usage/UsageDashboard.tsx

export function UsageDashboard() {
  // 1. State for filters (e.g., "Last 7 Days")
  const [timeRange, setTimeRange] = useState<TimeRange>("7d");

  // Convert the selected range into a number of days for the children
  const days = timeRange === "30d" ? 30 : timeRange === "7d" ? 7 : 1;

  // 2. The layout: Summary Cards, Charts, and Tables
  return (
    <div className="space-y-8 pb-8">
      {/* The big numbers at the top */}
      <UsageSummaryCards days={days} />

      {/* The line graph showing trends */}
      <UsageTrendChart days={days} />

      {/* Detailed tables for granular data */}
      <RequestLogTable />
    </div>
  );
}

Beginner Note: This React component is the "Director." It tells the specialized components (like UsageSummaryCards) to fetch and display their specific pieces of data.

Configuration: Setting Prices

To calculate costs accurately, the user can configure prices in the settings. This is handled by PricingConfigPanel.

// Inside UsageDashboard.tsx render
<AccordionItem value="pricing">
  <AccordionTrigger>
     Advanced Pricing Configuration
  </AccordionTrigger>
  <AccordionContent>
     {/* UI to set cost per 1M tokens */}
     <PricingConfigPanel /> 
  </AccordionContent>
</AccordionItem>
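To make that panel concrete, here is one plausible shape for the stored pricing data. The `ModelPricing` type and helper are illustrative assumptions, not the actual cc-switch schema; the one detail taken from the UI above is that prices are entered per 1M tokens, which keeps the numbers human-readable while the engine converts down to per-token rates internally:

```typescript
// Hypothetical record the panel might edit: USD per 1M tokens.
interface ModelPricing {
  model: string;
  inputPerMTok: number;   // rate for prompt tokens
  outputPerMTok: number;  // rate for completion tokens
}

// Convert the human-friendly "per 1M" figure into a per-token rate.
function perToken(perMTok: number): number {
  return perMTok / 1_000_000;
}

// Example config the user might save from the panel
const pricing: ModelPricing[] = [
  { model: "gpt-4o", inputPerMTok: 2.5, outputPerMTok: 10.0 },
];
```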

Internal Implementation: From Request to Chart

How does a chat message turn into a bar on a chart? Let's trace the data flow.

Sequence Diagram

sequenceDiagram
    participant AI as AI Model
    participant Proxy as Proxy Server
    participant DB as SQLite DB
    participant UI as Dashboard

    AI-->>Proxy: Response (Tokens: 100)
    Note right of Proxy: 1. Calculate Cost
    Proxy->>Proxy: 100 tokens * $0.00001 = $0.001
    Note right of Proxy: 2. Log to DB
    Proxy->>DB: INSERT INTO logs (tokens, cost...)
    Note right of UI: 3. User opens Dashboard
    UI->>DB: SELECT SUM(cost) FROM logs
    DB-->>UI: Total: $5.00
    UI->>UI: Draw Chart

Step 1: The Database Table

First, we need a place to store the history. We defined this table back in Chapter 4: SQLite Persistence & Schema.

-- Inside the database schema
CREATE TABLE request_logs (
    id TEXT PRIMARY KEY,
    provider_name TEXT,
    model TEXT,
    prompt_tokens INTEGER,
    completion_tokens INTEGER,
    total_cost REAL,      -- The calculated money spent
    timestamp INTEGER     -- When it happened
);

Step 2: Logging the Request (logger.rs)

When a request finishes in our Local Proxy Gateway, we call a function to save the details.

// src-tauri/src/usage/logger.rs

pub async fn log_request(
    db: &Database,
    usage_data: UsageData
) -> Result<(), AppError> {
    // Insert one row per finished request, matching the
    // request_logs columns defined in the schema above
    db.conn.execute(
        "INSERT INTO request_logs 
        (provider_name, model, prompt_tokens, completion_tokens, total_cost, timestamp)
        VALUES (?1, ?2, ?3, ?4, ?5, ?6)",
        (
            usage_data.provider_name,
            usage_data.model,
            usage_data.prompt_tokens,
            usage_data.completion_tokens,
            usage_data.cost,
            current_time()
        ),
    )?;
    Ok(())
}

Step 3: Aggregating Data for the UI

When the React dashboard asks for "Last 7 Days," we don't send thousands of raw logs. We ask the database to do the math for us. This is much faster.

Here is how the Rust backend calculates the summary:

// src-tauri/src/usage/queries.rs

pub fn get_daily_usage(db: &Database, days: i64) -> Result<Vec<DailyStat>> {
    // SQL Magic: group by calendar day and sum the costs.
    // ?1 is the cutoff timestamp (now minus `days` worth of seconds).
    let mut stmt = db.conn.prepare(
        "SELECT 
            date(timestamp, 'unixepoch') as day, 
            SUM(total_cost) as daily_cost
         FROM request_logs
         WHERE timestamp > ?1
         GROUP BY day
         ORDER BY day"
    )?;
    
    // ... Convert rows to Rust objects and return
}

Beginner Note: The GROUP BY command is powerful. It takes all logs from "2023-10-27" and smashes them into one single row with the total cost. This makes drawing charts very easy.
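The same aggregation can be sketched in TypeScript as a toy in-memory version of what the SQL does. The `LogRow` and `groupByDay` names here are hypothetical, purely to show the mechanics of GROUP BY + SUM:

```typescript
interface LogRow {
  timestamp: number; // unix seconds, like the DB column
  totalCost: number;
}

interface DailyStat {
  day: string; // "YYYY-MM-DD"
  dailyCost: number;
}

// Equivalent of: SELECT date(timestamp,'unixepoch') AS day, SUM(total_cost)
//                FROM request_logs GROUP BY day ORDER BY day
function groupByDay(rows: LogRow[]): DailyStat[] {
  const totals = new Map<string, number>();
  for (const row of rows) {
    // SQLite's date(..., 'unixepoch') formats the timestamp as a UTC calendar day
    const day = new Date(row.timestamp * 1000).toISOString().slice(0, 10);
    totals.set(day, (totals.get(day) ?? 0) + row.totalCost);
  }
  return Array.from(totals.entries())
    .map(([day, dailyCost]) => ({ day, dailyCost }))
    .sort((a, b) => a.day.localeCompare(b.day));
}
```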

Connecting the Pieces

Now, let's look at how the frontend requests this data. We use a library called TanStack Query; its useQuery hook handles fetching and caching.

// Inside a React component (imports go at the top of the file)
import { useQuery } from "@tanstack/react-query";
import { invoke } from "@tauri-apps/api/core"; // "@tauri-apps/api/tauri" on Tauri v1

const { data } = useQuery({
  queryKey: ["usage", "summary", "7d"],
  queryFn: async () => {
    // Call the Rust backend command we defined above
    return await invoke("get_daily_usage", { days: 7 });
  }
});

// data is a list of daily rows, e.g. [{ day: "2023-10-27", daily_cost: 5.42 }, ...]

The UsageTrendChart component then takes this data and uses a charting library to draw the line graph you see on screen.
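One detail worth noting for trend charts: GROUP BY only returns days that have at least one log, so a quiet Sunday simply vanishes and the line chart would skip it. A common fix is to pad the series with zero-cost entries before drawing. The `fillMissingDays` helper below is a hypothetical sketch of that idea, not actual cc-switch code:

```typescript
interface DailyStat {
  day: string; // "YYYY-MM-DD"
  dailyCost: number;
}

// Insert zero-cost rows for calendar days the query returned nothing for,
// so the chart's x-axis is continuous from startDay to endDay (inclusive).
function fillMissingDays(
  stats: DailyStat[],
  startDay: string,
  endDay: string
): DailyStat[] {
  const byDay = new Map<string, number>(
    stats.map((s) => [s.day, s.dailyCost] as const)
  );
  const out: DailyStat[] = [];
  const cursor = new Date(startDay + "T00:00:00Z");
  const end = new Date(endDay + "T00:00:00Z");
  while (cursor <= end) {
    const day = cursor.toISOString().slice(0, 10);
    out.push({ day, dailyCost: byDay.get(day) ?? 0 });
    cursor.setUTCDate(cursor.getUTCDate() + 1); // step one UTC day forward
  }
  return out;
}
```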

Project Conclusion

Congratulations! You have reached the end of the cc-switch tutorial.

Let's look back at what we have built:

  1. Local Proxy Gateway: We intercepted traffic from CLI tools.
  2. Intelligent Routing & Failover: We made the connection unbreakable by auto-switching providers.
  3. Provider Adaptation Layer: We translated different API languages so any tool works with any model.
  4. SQLite Persistence & Schema: We gave our app a permanent, safe memory.
  5. Unified MCP Management: We centralized the configuration of external tools.
  6. Usage Analytics Dashboard: We visualized the data to gain insights and control costs.

You now have a professional-grade AI infrastructure running right on your desktop. You have moved from being a passive consumer of APIs to an active manager of your AI resources.

Thank you for following along! Go forth and code with superpowers.

