Changelog

Latest updates to the Sourcegraph Code Intelligence Platform!

Cody v1.38: Improved chat performance

Beatrix Woo

Cody v1.38 brings significant improvements to chat performance and updates to the autocomplete provider.

Improved context fetching from @-mentioned repos for Cody Enterprise Cloud

Kelvin Yap

In the latest Sourcegraph release for Enterprise Cloud customers, we've improved the context fetched from @-mentioned repos, providing greater precision and increasing the likelihood that Cody returns higher-quality responses.

Removal of multiple job pods for Kubernetes Executors

Kelvin Yap

Single job pods have been the default method for native Kubernetes Executors for the past couple of releases, and based on positive feedback we are now removing multiple job pods as a configuration option.

Autocomplete improvements for DeepSeek-V2

Kelvin Yap

DeepSeek-V2 was recently introduced as the default autocomplete model for Cody Enterprise customers, and we have implemented optimizations to prompt caching and direct routing that have resulted in improved latency and quality for both single-line and multi-line completions.

Execute terminal commands with Smart Apply

Kelvin Yap

Smart Apply now supports executing commands in the terminal, allowing users to run a suggestion Cody provides with a single click.

Join the waitlist to use OpenAI o1 with Cody

Alex Isken

OpenAI just unveiled its latest models, o1-preview and o1-mini, which offer enhanced reasoning capabilities for tackling complex tasks. Both models are now available in Cody as a limited release.

Sunsetting the experimental Cody Ignore feature

Alex Isken

We are sunsetting Cody Ignore in this release. Cody Ignore was an experimental feature that was off by default for all users and only enabled via the experimental feature setting in the Cody extension. It let users specify files or folders in a .cody/ignore file, which Cody then ignored and excluded from context.
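For reference, a .cody/ignore file simply listed paths for Cody to skip. The sketch below is illustrative only: the directory names are hypothetical, and gitignore-style patterns are assumed.

```
# .cody/ignore — illustrative example (gitignore-style patterns assumed)
# Exclude generated and vendored code from Cody's context
node_modules/
dist/
vendor/

# Exclude sensitive files
secrets/
*.pem
```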

New default models for enterprise: DeepSeek-V2 and Claude 3.5 Sonnet

Kelvin Yap

Cody is built on model interoperability, and we aim to provide access to the best and latest models. Today we're updating the default models offered to Enterprise customers: DeepSeek-V2 is the recommended default for autocomplete, while Claude 3.5 Sonnet is now the recommended default for chat and prompts.

Cody Web is now generally available

Kelvin Yap

The core experience of using Cody is alongside your code in your IDE of choice, but there are times when you want to interact with Cody via the web. This can be particularly helpful as part of a workflow where you're performing a search via Code Search and need help from Cody, or when you're on a mobile device and want to ask it a question. We're happy to announce that Cody Web is now generally available and includes numerous improvements that make the web experience better and more consistent.

Smart Apply available in JetBrains

Kelvin Yap

Smart Apply is now available for JetBrains, allowing users to take suggestions from the Cody chat window and near-instantly turn them into diffs in their code.

New JetBrains side panel

Kelvin Yap

In our quest for consistency across how you use Cody, JetBrains users now get the same side panel experience found in Cody for VS Code and on the web. Chat lives in the side panel, there are now dedicated tabs for easier navigation, and prompts are now easier to discover and use.

Search Jobs now generally available

Kelvin Yap

Search Jobs are now generally available, allowing search queries to be run across your entire codebase when completeness is prioritized over fast response times and result ranking.

@-mention directories for Cody Web and Enterprise

Kelvin Yap

Alongside files and repos, Cody now lets developers @-mention directories as context, making it easier for users working with larger, more complex repos like monorepos to include the best context with their prompts.

Code Monitor Webhooks generally available

Kelvin Yap

Webhooks for Code Monitors are now generally available, notifying a webhook receiver whenever a new search result or change appears.
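As an illustration of the receiving side, here is a minimal sketch of an endpoint that could accept these notifications. The port and handling are assumptions, and the notification payload shape is not specified here, so the handler simply logs whatever JSON body it receives.

```python
# Minimal sketch of a webhook receiver for Code Monitor notifications.
# Assumes the monitor POSTs JSON to this endpoint; payload fields are not
# assumed, so the handler just logs the raw body and acknowledges receipt.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class CodeMonitorWebhook(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            payload = json.loads(body)
        except json.JSONDecodeError:
            payload = body.decode("utf-8", errors="replace")
        print("Code Monitor event:", payload)  # replace with real handling
        self.send_response(200)  # acknowledge so the sender does not retry
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), CodeMonitorWebhook).serve_forever()
```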

Accept and reject block-level diffs for granular control

Alex Isken

You can now control which parts of a multi-line edit to accept with more granularity. When you ask Cody to edit a block of code, the presented diff will be split into discrete code blocks, and you can accept or reject each diff separately.

Chat now uses the full sidebar width in VS Code

Alex Isken

Cody chat in VS Code now uses the full width of the sidebar, giving the conversation more room on screen.

Ask Cody to Fix, now on Windows in JetBrains

Alex Isken

The Ask Cody to Fix action is now available on Windows in JetBrains IDEs.

Dynamically insert code from Cody chat into your files with Smart Apply

Alex Isken

Chat-Oriented Programming (CHOP) allows users to interact with AI to solve problems and write code directly through chat. The new Smart Apply feature enables quick conversion of AI suggestions into code diffs. By pressing Apply, Cody intelligently inserts suggested code directly into code files, streamlining the process from chat to implementation.

Faster, more accurate autocomplete with VS Code

Alex Isken

Autocomplete has been improved with faster performance and greater accuracy, including a 350-millisecond reduction in latency and a 4% increase in completion acceptance rate. The update features the new DeepSeek-V2 model and is available to Free and Pro users, with full rollout to Enterprise users coming soon.