In the previous chapter, Agents & Rules (Skills), we gave our AI a personality. Before that, in UI & Window Management, we gave it a face.
Now, we need to give it Action Buttons.
Imagine you have a high-tech television (the AI). You have electricity (Global State) and a screen (UI). But how do you actually tell the TV to do something specific?
You need a remote control with specific buttons: one to search channels, one to adjust the picture, and so on.
In `99`, Operations (Ops) are these buttons.
While the "brain" (AI) is the same for all of them, the workflow is different. "Search" creates a list of files, while "Visual" edits text. The Op defines this workflow.
An Op is a manager. It doesn't do the heavy lifting itself. Instead, it coordinates the other departments: the UI (collects input), the Prompt (packages context), the Request (talks to the AI), and the Handler (acts on the result).
The Handler is the main difference between Ops:
- For `search`, the Handler parses the answer and puts it in a Quickfix list (a clickable list of files).
- For a `visual` refactor, the Handler takes the answer and overwrites the text in the buffer.
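To make the division of labor concrete, here is an illustrative sketch, not the plugin's actual code (`send_request` and `replace_selection` are invented names): every Op reuses one shared pipeline and plugs in its own handler.

```lua
-- Illustrative only: "shared pipeline + per-Op handler".
-- All names besides `search`/`visual` are hypothetical.
local handlers = {
    -- search: turn the response into a clickable list
    search = function(ctx, response) create_search_locations(ctx, response) end,
    -- visual: overwrite the selected text with the response
    visual = function(ctx, response) replace_selection(ctx, response) end,
}

local function run_op(name, ctx, opts)
    -- Prompting, UI, and networking are identical for every Op;
    -- only the handler differs.
    send_request(ctx, opts, handlers[name])
end
```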
Let's build the workflow for the `search` command.
The Goal: the user types a query, the AI scans the project for matching locations, and the results appear as a clickable list.
How does the code wire all these systems together? Let's trace the lifecycle of a Search Operation.
In lua/99/init.lua, we expose the button to the user. This function prepares the Context (environment data) and opens the input window.
```lua
-- lua/99/init.lua
function _99.search(opts)
    -- 1. Create context (who acts? The 'search' op)
    local context = get_context("search")

    -- 2. Open the UI asking for input.
    --    When the user presses Enter, run 'ops.search'.
    capture_prompt(ops.search, "Search", context, opts)
end
```
Explanation: We aren't searching yet. We are just setting the stage and opening the microphone (UI).
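`capture_prompt` itself lives in the UI layer; conceptually it only opens an input window and hands the typed text to the callback. A hedged sketch (the `open_input_window` helper and the exact signature are assumptions, not the plugin's real API):

```lua
-- Hypothetical sketch of capture_prompt.
-- `open_input_window` is an invented placeholder for the floating
-- input covered in the UI & Window Management chapter.
local function capture_prompt(op, title, context, opts)
    open_input_window(title, function(user_text)
        opts = opts or {}
        opts.prompt = user_text
        op(context, opts) -- here: ops.search(context, opts)
    end)
end
```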
Once the user types their query, ops.search is called. This lives in lua/99/ops/search.lua. This is where the magic happens.
```lua
-- lua/99/ops/search.lua
local function search(context, opts)
    -- 1. Prepare the Request object
    local request = Request.new(context)

    -- 2. Create the Prompt (combine user text + code context)
    local prompt, refs = make_prompt(context, opts)
    request:add_prompt_content(prompt)

    -- 3. Define what happens when we finish (the clean up)
    local clean_up = make_clean_up(function()
        request:cancel()
    end)

    -- 4. Launch the request!
    request:start(make_observer(clean_up, function(status, response)
        if status == "success" then
            -- HANDLER: do something with the result
            create_search_locations(context._99, response)
        end
    end))
end
```
Explanation:
1. Prepare the `Request` (like an empty envelope).
2. Create the `Prompt` (the letter) and add it to the request.
3. Define a `clean_up` routine (to turn off the loading spinner).
4. `start` the request and attach an Observer. The Observer waits for the AI to finish.
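The Observer here is just a table of callbacks that the Request invokes. A minimal sketch of what `make_observer` could return (the field names are assumptions, not the plugin's real API):

```lua
-- Hypothetical sketch of make_observer. Whatever the real field
-- names are, the idea is: always clean up first, then let the
-- Op's handler decide what to do with the result.
local function make_observer(clean_up, on_done)
    return {
        complete = function(status, response)
            clean_up() -- stop the spinner, release resources
            on_done(status, response)
        end,
    }
end
```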
The AI returns a big string of text. For a search, we expect the AI to return lines looking like: filename:line:description.
We need to parse this string and feed it into Neovim's "Quickfix List" (a built-in feature for listing errors or search results).
```lua
-- lua/99/ops/search.lua
local function create_search_locations(_99, response)
    local qf_list = {}

    -- Split the AI's answer line by line
    for _, line in ipairs(vim.split(response, "\n")) do
        -- Helper to parse "file.lua:10:found it"
        local res = parse_line(line)
        if res then
            table.insert(qf_list, res)
        end
    end

    -- Tell Neovim to show this list ("r" replaces the current list)
    vim.fn.setqflist(qf_list, "r")
    vim.cmd("copen") -- Open the bottom window
end
```
Explanation: This function converts the "smart" AI response into a "dumb" list that Neovim understands. copen is the Vim command to open that list window at the bottom of the screen.
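`parse_line` isn't shown above. A minimal sketch using Lua string patterns (the real implementation may be stricter about malformed lines) that produces entries in the shape `setqflist` expects:

```lua
-- Hypothetical sketch of parse_line: "file.lua:10:found it"
-- becomes a standard quickfix entry, or nil on no match.
local function parse_line(line)
    local filename, lnum, text = line:match("^(.-):(%d+):(.*)$")
    if not filename then
        return nil
    end
    return {
        filename = filename,   -- file to jump to
        lnum = tonumber(lnum), -- line number
        text = text,           -- description shown in the list
    }
end
```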
To understand how Ops differ, let's look briefly at what `visual` does differently (conceptually).
Structurally, `visual` is almost identical, except for the Handler:
```lua
-- Conceptual Visual Handler
if status == "success" then
    -- Instead of building a list, we replace text in the buffer.
    -- nvim_buf_set_text takes (buffer, start_row, start_col,
    -- end_row, end_col, replacement_lines).
    local range = context.range
    vim.api.nvim_buf_set_text(
        range.buffer,
        range.start_row, range.start_col,
        range.end_row, range.end_col,
        vim.split(response_text, "\n")
    )
end
```
The Op abstraction allows us to reuse the Prompting, UI, and Networking logic, changing only the final action.
We have built the Buttons for our remote control.
Each button runs the same pipeline but ends with a different Handler, producing either a file list (`search`) or a code edit (`visual`). However, we have glossed over a massive part of the system: The Request. How does the plugin actually talk to the AI? How does it handle streaming? How does it handle errors?
Next Chapter: The Request Lifecycle
Generated by Code IQ