Sourcegraph Cody and GitHub Copilot are AI code assistants optimized for different enterprise needs. Cody excels in three key scenarios: (1) enterprises working with large, distributed codebases across multiple code hosts, (2) organizations requiring precise, context-aware code generation and responses, and (3) teams needing deeper, custom SDLC integrations from their code AI tooling.
Cody's key advantage is its ability to understand and utilize context from complete codebases, regardless of where code is hosted. This context enables Cody to generate higher-quality code and provide more accurate responses via chat and the Prompt Library (which allows teams to create customizable, shareable prompts to automate their work). Additionally, Cody offers unmatched flexibility in AI infrastructure - organizations can self-host models, use their own model keys, or establish secure connections to LLMs through cloud providers like Amazon Bedrock and Azure OpenAI.
GitHub Copilot is optimized for organizations seeking basic autocomplete and chat functionality for small codebases hosted exclusively on GitHub Enterprise Cloud. While it integrates well with GitHub's ecosystem, its context is limited to single repositories and GitHub-hosted code, and organizations cannot adopt standardized prompts or control their model infrastructure.
TL;DR: Choose Cody for comprehensive codebase understanding across multiple hosts, higher-quality code generation through better context, and flexible model hosting. Choose Copilot for basic code autocomplete and chat for small codebases hosted in GitHub Enterprise Cloud.
Features | Sourcegraph Cody | GitHub Copilot |
---|---|---|
Autocomplete | ||
Chat | ||
Custom and shareable prompts | ||
Inline edits |
IDE support | Sourcegraph Cody | GitHub Copilot |
---|---|---|
Visual Studio Code | ||
JetBrains | ||
Visual Studio | ||
Eclipse | ||
Web browser | ||
CLI | ||
API |
LLM / Model | Sourcegraph Cody | GitHub Copilot |
---|---|---|
Chat model (default) | Claude 3.5 Sonnet | GPT-4o |
Choose your LLM | ||
Bring your own LLM key | ||
Self-hosted LLM support | ||
Support for Amazon Bedrock, Azure OpenAI, and Google Cloud Vertex AI |
Context used for response personalization | Sourcegraph Cody | GitHub Copilot |
---|---|---|
Local code context | ||
Multi-repository context | ||
Support for remote context from GitLab and Bitbucket repositories | ||
Non-code context |
Deployment options | Sourcegraph Cody | GitHub Copilot |
---|---|---|
Self-hosted | ||
Cloud |
Pricing | Sourcegraph Cody | GitHub Copilot |
---|---|---|
Free tier offered | ||
Pro tier pricing for individuals | $9 / user / month | $10 / user / month |
Enterprise tier pricing | $19 / user / month | $39 / user / month |
Last updated: 2024-04-16
Cody makes it easy to write, fix, and maintain code.