AICHE + Cody Integration

Voice commands for enterprise AI coding

Command Cody with your voice. Codebase-aware completions spoken naturally.

Download AICHE
Works on: macOS, Windows, Linux

The short answer: open your IDE with Cody installed, open the Cody chat panel, press ⌃+⌥+R (Mac) or Ctrl+Alt+R (Windows/Linux), speak your codebase question for 30-60 seconds, and AICHE inserts the formatted prompt for Cody to analyze.

Cody Knows Your Entire Codebase. Do Your Prompts?

Sourcegraph Cody is different from other AI coding assistants. It indexes and understands your entire repository. It can search across thousands of files, follow import chains, and understand how your services connect. This codebase-wide context is Cody's biggest advantage.

But that advantage only matters if your prompts reference it. Asking "how do I add caching" gives Cody nothing to work with from your specific codebase. Asking "how should I add Redis caching to the UserService, given that we already use a cache-aside pattern in the OrderService and our cache invalidation strategy in the event bus publishes to a shared topic" gives Cody the context to search your repo, find the relevant patterns, and generate code that fits.

Typing prompts that reference your specific architecture, service names, and patterns takes 5-10 minutes. Speaking them takes 30-60 seconds, because you already know this stuff. You think about your codebase constantly. You just need a way to get that knowledge into the prompt without the typing bottleneck.

How to Set It Up

  1. Open your IDE (VS Code, JetBrains, or Neovim) with the Cody extension installed.
  2. Open the Cody chat panel in the sidebar.
  3. Press your AICHE hotkey (⌃+⌥+R on Mac, Ctrl+Alt+R on Windows/Linux) to start recording.
  4. Speak your complete question with codebase-specific context (example: "explain how the authentication flow works across our microservices, specifically how JWT tokens are validated in the API gateway, how they get passed to downstream services through request headers, and how token refresh works when access tokens expire. Include the Redis caching layer we use for blacklisting revoked tokens").
  5. Press the hotkey again. AICHE transcribes, applies Content Organization for multi-part questions, and inserts the text.
  6. Press Enter and let Cody search your codebase and answer.

Asking Architecture-Level Questions

Cody excels at questions that span multiple files and services. These are the questions that are hardest to type because they involve naming specific files, services, patterns, and their interactions.

When you speak, you naturally reference things the way you think about them: "the payment processing pipeline" or "the way we handle retries in the message queue consumer." Cody takes those references and searches your codebase for matches. You do not need to remember exact file paths or function names. Speak the way you think about the system, and Cody maps it to code.

This is especially useful during onboarding or code review. You can dictate questions like "walk me through what happens when a user clicks checkout, starting from the React component through the API layer, the payment service, and the notification system. Show me the actual code path." Typing that prompt would take 4 minutes. Speaking it takes 20 seconds.

Using Cody for Code Generation With Context

When generating new code, tell Cody what existing patterns to follow. Dictate: "create a new endpoint for bulk user import that follows the same pattern as the existing bulk order import in the order-service, including the same validation approach, the same error accumulation strategy where we collect all errors and return them at the end instead of failing on the first one, and the same progress tracking using our WebSocket notification pattern."

Cody will search your repo, find those existing patterns, and generate new code that matches. The detail in your spoken prompt is what makes Cody's codebase awareness useful instead of theoretical.
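The error-accumulation strategy named in that prompt looks roughly like this. This is a hedged sketch of the general technique, not code from any real order-service; the field names and validation rules are invented for illustration.

```python
# Error accumulation: validate every record, collect all failures, and
# report them together instead of failing on the first bad row.
def validate_user(record: dict) -> list[str]:
    errors = []
    email = record.get("email", "")
    if not email:
        errors.append("missing email")
    elif "@" not in email:
        errors.append("malformed email")
    if not record.get("name"):
        errors.append("missing name")
    return errors

def bulk_import_users(records: list[dict]) -> dict:
    imported, errors = [], []
    for index, record in enumerate(records):
        row_errors = validate_user(record)
        if row_errors:
            # Accumulate instead of raising, then keep processing.
            errors.append({"row": index, "errors": row_errors})
        else:
            imported.append(record)
    return {"imported": len(imported), "errors": errors}
```

Describing this strategy out loud, with the service it already lives in, is what steers Cody toward generating a matching implementation rather than a generic fail-fast loop.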

Heads-up: mention specific service names, file types, or patterns you want Cody to focus on. Saying "in the auth-service" or "look at the Redis integration layer" helps Cody narrow its search across large repositories.

Pro tip: use voice to explain what you are trying to accomplish before asking how. Saying "I need to add rate limiting to prevent brute-force attacks on login, using our existing Redis setup" gives Cody better context than just "how do I add rate limiting."

Result: architectural questions that took 10 minutes to type with full service context now take 60 seconds to speak, and Cody provides better answers because your spoken prompt included the service names and patterns that make its codebase search useful.

Do this now: open Cody, press your hotkey, and ask one question about your codebase that spans multiple files or services. Reference specific parts of your architecture by name and see how Cody leverages that context.

#ai-coding #voice-commands #development