How is THIS Coding Assistant FREE?

Software engineering
14 Dec 2023 · 05:19

TLDR: In this video, the creator explores Cody, a new AI coding assistant, comparing it to GitHub Copilot. Cody stands out with its free tier for individual developers and unique features like selecting language models from different vendors and defining custom commands. The video also discusses the potential of running a local language model and the creator's plan to use Cody alongside GitHub Copilot for its distinctive capabilities.

Takeaways

  • 🤖 Cody is a new AI coding assistant designed to help with coding challenges.
  • 🆓 Cody offers a free tier for individual developers working on public and private code.
  • 📋 Cody's integration is available in various editors, including VS Code and Neovim, with more coming soon.
  • 🔍 Cody's primary function is code autocomplete, which can complete lines or entire functions.
  • 🤖 Unique to Cody is the ability to select different LLMs (Large Language Models) from various vendors.
  • 💬 Cody features custom commands and a chat interface for user interaction.
  • 🔧 Custom commands in Cody can be scoped to work on the entire codebase or just a selection.
  • 🔄 Cody indexes the entire codebase for commands, which took about 6 minutes for a large repository.
  • 🔄 The free tier of Cody has a rate limit, which can be mitigated by using a personal LLM key.
  • 🌐 The video is sponsored by Sourcegraph, but the creator had complete freedom to form their own conclusions.
  • 📝 The creator is considering using both Cody and GitHub Copilot for different use cases due to their unique features.

Q & A

  • What is Cody and how does it differ from GitHub Copilot?

    -Cody is a new AI coding assistant that offers features such as selecting different LLMs (Large Language Models) from various vendors and custom commands, which are not available in GitHub Copilot.

  • What are the unique features of Cody that set it apart from other coding assistants?

    -Cody's unique features include the ability to select different LLMs from various vendors and support for custom commands, which allow for more personalized interactions and functionality.

  • Is there a free tier for individual developers using Cody?

    -Yes, Cody offers a free tier for individual developers to use on public and private code, which is a significant advantage for those who do not wish to pay for coding assistance.

  • Which programming environments is Cody compatible with?

    -Cody is compatible with various editors, including Visual Studio Code and Neovim, with support for more editors such as Emacs coming soon.

  • How does Cody handle code completion?

    -Cody's autocomplete feature can complete lines of code or entire functions, similar to other coding assistants, but with the added ability to customize the LLM used for this feature.

  • What is the purpose of custom commands in Cody?

    -Custom commands in Cody allow users to create specific actions with a given name and scope, which can be triggered through the chat interface, providing a more tailored coding experience.

  • How does Cody address the entire codebase rather than just the current file?

    -Cody requires an initial indexing run to become aware of the entire codebase; once indexing completes, it can answer questions and run commands against the whole repository, not just the current file.

  • What are some potential improvements or additions the user would like to see in Cody?

    -The user would like local LLM integration, which would let Cody run against a locally hosted model and potentially sidestep the rate limit, as well as the ability to use personal API keys for providers such as OpenAI.

  • How does the user plan to use Cody alongside GitHub Copilot?

    -The user intends to use both Cody and GitHub Copilot for different use cases, leveraging Cody's unique features like custom LLM selection and custom commands.

  • What is the user's opinion on the free tier and its rate limit?

    -The user finds the free tier appealing but notes its rate limit: once the limit is hit, you have to wait before the service becomes available again.

Outlines

00:00

🤖 Exploring Cody: The New AI Coding Assistant

The video introduces Cody, a new AI coding assistant, and discusses its features compared to GitHub Copilot. The author plans to test Cody on personal projects and highlights its free tier for individual developers. Sponsored by Sourcegraph, the video promises an unbiased review. Cody's unique features include the ability to select different LLMs from various vendors and custom commands, which are not available in Copilot. The author also mentions local LLM integration and the free tier's rate limit as areas for improvement.

05:01

🎉 Conclusion and Sponsor Acknowledgement

The video concludes with a brief acknowledgement of the sponsor, Sourcegraph, and a thank-you note before the author signs off; music and applause mark the end of the video.

Keywords

💡Cody

Cody is an AI coding assistant mentioned in the video. It is designed to help developers write code more efficiently by providing autocomplete suggestions and other features. In the context of the video, Cody is being compared to GitHub Copilot, another AI coding assistant, with a focus on its unique offerings such as the ability to select different language models and custom commands.

💡Large Language Models (LLMs)

LLMs are AI models trained on vast amounts of text data, enabling them to generate human-like text. In the video, LLMs are the backbone of AI coding assistants like Cody and GitHub Copilot. They are used to predict and complete code lines or entire functions, making the coding process faster and more efficient for developers.

💡Autocomplete

Autocomplete is a feature in coding assistants like Cody that predicts and suggests the next part of a code line as the developer types. This functionality helps to speed up the coding process by reducing the amount of typing required and can also assist in adhering to coding standards and best practices.
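
As a rough illustration of what this looks like in practice (a generic sketch, not Cody's literal output): the developer types a signature and a comment, and the assistant proposes a body to accept or edit.

```typescript
// Generic illustration of editor autocomplete, not actual Cody output.
// The developer types the signature and doc comment; the assistant
// suggests a plausible body, which the developer accepts or edits.

/** Returns the median of a list of numbers. */
function median(values: number[]): number {
  // --- everything below is the kind of suggestion an assistant offers ---
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0
    ? (sorted[mid - 1] + sorted[mid]) / 2
    : sorted[mid];
}
```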

💡Custom Commands

Custom Commands in Cody allow users to create specific actions that can be triggered with a simple command. These commands can be tailored to perform various tasks within the code, such as formatting or refactoring, and can be scoped to work on the entire codebase or just a selection. This feature provides a level of personalization and efficiency that is not typically found in other coding assistants.
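
For illustration, a custom command boils down to a name, a prompt, and a context scope. Cody reads these from a JSON file in the workspace (cody.json at the time of the video); the sketch below writes a hypothetical definition as a TypeScript object, and its field names are assumptions rather than the exact schema, so the extension's documentation is the authority on the real format.

```typescript
// Hypothetical shape of a custom command, written as a TypeScript object
// purely for illustration. Cody reads custom commands from a JSON file
// (cody.json); the exact field names and nesting vary by extension version,
// so treat this as an assumption, not the real schema.
interface CustomCommand {
  description: string;        // shown when picking the command in chat
  prompt: string;             // instruction sent to the LLM
  context: {
    selection?: boolean;      // include the currently selected code
    currentFile?: boolean;    // include the whole open file
    codebase?: boolean;       // include indexed codebase context
  };
}

// Example: a "stinky code" style command like the one shown in the video,
// scoped to the current selection only.
const findSmells: CustomCommand = {
  description: "Point out code smells in the selected code",
  prompt:
    "Review the selected code and list any code smells, with a short " +
    "suggestion for how to refactor each one.",
  context: { selection: true },
};
```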

💡Sourcegraph

Sourcegraph is the company that sponsored the video and the maker of Cody. It provides code search and code intelligence tools for navigating large codebases, which complement Cody's codebase-aware features. The sponsorship indicates a relationship between the company and the content creator, but the creator maintains editorial independence.

💡GitHub Copilot

GitHub Copilot is an AI coding assistant developed by GitHub and OpenAI. It is powered by OpenAI models (originally Codex, a descendant of GPT-3) and assists developers with coding tasks. In the video, the creator compares Cody's features with those of GitHub Copilot, highlighting the differences and potential advantages of using Cody, especially for those who are not already paying for Copilot.

💡Free Tier

The Free Tier refers to the basic version of Cody that is available at no cost to individual developers. This tier allows users to access the AI coding assistant's features for both public and private code, making it an attractive option for those who are cost-conscious or contributing to open-source projects.

💡Code Smell

Code Smell is a term used in software development for code that works but shows surface signs of a deeper problem, such as duplication or unclear structure. In the video, Cody is shown to have a command that can identify 'stinky' code, a playful name for detecting code that may need refactoring or improvement. This feature helps developers maintain code quality.
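
A generic example of the kind of smell such a command would be expected to flag (duplicated logic and an unexplained constant), together with a cleaner version; this is illustrative and not output from Cody.

```typescript
// Before: the 0.2 discount rule is duplicated and unexplained (a smell).
function priceForMemberBefore(base: number): number {
  return base - base * 0.2;
}
function priceForStaffBefore(base: number): number {
  return base - base * 0.2;
}

// After: the rule has a name and lives in one place.
const LOYALTY_DISCOUNT_RATE = 0.2;

function applyDiscount(base: number, rate: number): number {
  return base * (1 - rate);
}

const priceForMember = (base: number) => applyDiscount(base, LOYALTY_DISCOUNT_RATE);
const priceForStaff = (base: number) => applyDiscount(base, LOYALTY_DISCOUNT_RATE);
```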

💡Local LLMs

Local LLMs refer to language models that are hosted and run on a user's local machine or network, as opposed to being accessed through a cloud service. The video's creator expresses a desire for Cody to support local LLMs, which would allow for faster response times and potentially bypass rate limits imposed on the free tier of the service.
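
Cody did not offer this at the time of the video, but as a sketch of what "running against a local LLM" means in practice, here is a minimal call to a locally hosted model via Ollama's HTTP API; the model name and the assumption that a model has already been pulled locally are illustrative, not part of Cody.

```typescript
// Minimal sketch of querying a locally hosted model through Ollama's
// HTTP API. Assumes Ollama is running on its default port and the
// "codellama" model has been pulled; not a Cody feature.
async function completeLocally(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama", // any locally available model
      prompt,
      stream: false,      // return a single JSON object instead of a stream
    }),
  });
  if (!res.ok) throw new Error(`local LLM returned ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}

// Usage:
// completeLocally("Write a TypeScript function that reverses a string.")
//   .then(console.log);
```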

💡Rate Limit

A Rate Limit is a restriction placed on how often a service can be used within a certain period. In the context of the video, Cody's free tier has a rate limit, meaning that after a certain number of requests, users would have to wait before they can continue using the service. The creator suggests that integration with local LLMs could be a solution to this limitation.
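
Independently of Cody, a common way for any client to cope with a rate-limited API is to back off and retry when it sees HTTP 429; the sketch below shows that general pattern and is not how Cody itself handles its free tier.

```typescript
// Generic client-side pattern for rate limits: retry with exponential
// backoff when the server responds with HTTP 429 (Too Many Requests).
async function fetchWithBackoff(
  url: string,
  init: RequestInit,
  maxRetries = 5,
): Promise<Response> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await fetch(url, init);
    if (res.status !== 429) return res; // not rate limited: hand back the response
    const delayMs = Math.min(60_000, 1_000 * 2 ** attempt); // 1s, 2s, 4s, ... capped at 60s
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error("still rate limited after retries");
}
```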

Highlights

Cody is a new AI coding assistant.

Cody uses large language models (LLMs) to help with coding challenges.

The video is sponsored by Sourcegraph, but the reviewer has complete freedom to make their own conclusions.

Cody has a free tier for individual developers for public and private code.

Cody works in various editors, including Visual Studio Code and Neovim, with more editor support on the way.

Cody's autocomplete feature can complete lines of code or entire functions.

Cody allows selecting different LLMs from different vendors.

Cody offers custom commands, a feature not present in GitHub Copilot.

Custom commands can be scoped to work on the entire codebase or just a selection.

Cody is aware of the entire codebase, not just the current file.

Cody's free tier has a rate limit, but using your own LLM key can potentially eliminate this limit.

The reviewer has been a GitHub Copilot user for over a year and has high expectations for Cody.

The reviewer plans to use both Cody and GitHub Copilot for different use cases.

The reviewer wishes for a local LLM integration to improve Cody's functionality.

Cody's unique features include the ability to switch LLMs and its custom commands.

The reviewer has a short wish list for additional features to enhance Cody's capabilities.