AI-Powered Programming: Exploring Modern Coding Assistants in Visual Studio Code


In the ever-evolving field of software development, AI coding assistants have emerged as powerful tools to enhance productivity and code quality. Integrated into popular IDEs like Visual Studio Code, these intelligent assistants provide real-time code suggestions, debugging help, optimization tips, and even chat-based conversations.

Leading the pack is GitHub Copilot, renowned for its contextual understanding and relevant code completions. However, there are several noteworthy alternatives, including specialized tools like Codeium, Cody, Blackbox, and Tabnine, as well as self-hosted language models like Llama Coder with Ollama.

In this blog post, we will explore AI coding assistants, focusing on their integration with Visual Studio Code. We will introduce each tool, highlight their key features and evaluate their performance through a demo application. By the end, you'll have a clear understanding of how these AI assistants can transform your coding workflow and help you choose the best one for your needs.

GitHub Copilot

GitHub Copilot, developed by GitHub in collaboration with OpenAI, has quickly become one of the most prominent AI coding assistants available. Seamlessly integrated with Visual Studio Code, Copilot leverages the power of OpenAI's Codex model to provide intelligent code suggestions, complete code snippets, and even generate entire functions based on natural language descriptions.

Visual Studio Code Extension for GitHub Copilot


What’s possible with GitHub Copilot:

  • Context-Aware Suggestions: Provides relevant code completions based on the surrounding code, function names, and comments.
  • Natural Language Processing: Translates plain English or German descriptions into functional code, based on the given context.
  • Multi-Language Support: Works with numerous programming languages and frameworks, from Python to JavaScript, Go and C++.
  • Learning and Adaptation: Adapts to your coding style over time, improving the accuracy of its suggestions based on your existing code.
  • Inline Documentation: Generates comments and documentation alongside code, enhancing code clarity and maintainability.

GitHub Copilot offers three plans: Individual ($10/month) for solo developers, students, and educators, featuring unlimited chat and real-time code suggestions; Business ($19/user/month) for organisations, adding user management, data exclusion from training, IP indemnity, and SAML SSO; and Enterprise ($39/user/month) for companies needing advanced customization, including fine-tuned models, pull request analysis, and enhanced management policies.


Codeium

Just like GitHub Copilot, Codeium promises to be an advanced coding assistant with autocompletion, context awareness, and chat, all for free. Codeium uses in-house trained models and offers free access to advanced code completion, search capabilities, and an interactive chat feature, making it an attractive alternative to Copilot. The service supports an extensive range of programming languages and integrates seamlessly with numerous IDEs.

Visual Studio Code Extension for Codeium


What’s possible with Codeium:

  • Context-Aware Completion: Similar to GitHub Copilot, Codeium offers context-aware suggestions for 70+ programming languages.
  • Full Repo Context Awareness: Analyzes the entire codebase to understand structure, dependencies, and relationships, improving code suggestions and navigation.
  • Intelligent Search: Finds files and code within a project folder using prompts.
  • Versatile Deployment Options: Offers SaaS for cloud-based access, on-premises installation for internal control, and deployment within a VPC for heightened security within a public cloud environment.
  • SOC 2 Type 2 Certified: Underwent audit ensuring adherence to stringent security, availability, processing integrity, confidentiality, and privacy standards, assuring users of robust data protection.

Individual users can enjoy Codeium's features for free, including code autocomplete, an in-editor AI chat assistant, and unlimited usage with encryption in transit, plus assistance via Discord. For teams of up to 200 seats, Codeium offers a plan at $12 per seat per month, providing additional benefits like an admin usage dashboard, seat management, advanced personalization, GPT-4 support, and organisation-wide zero-day retention, with upcoming features including document search.


Sourcegraph Cody

Sourcegraph Cody is an open-source AI coding assistant that provides AI-generated autocomplete, answers coding questions, and writes and fixes code. It supports different models, pulls context from both local and remote repositories, and features custom commands and swappable LLMs.

Visual Studio Code Extension for Cody


What’s possible with Cody:

  • Autocomplete: Cody autocompletes single lines or entire functions across any programming language, configuration file, or documentation.
  • Chat: Enables conversational queries with Cody, answering questions about programming topics or the codebase itself.
  • Built-In Commands: Improves the development workflow with Cody's built-in commands for understanding, improving, fixing, documenting, and generating unit tests for code.
  • Custom Commands (Beta): Lets you tailor Cody to a specific workflow with custom commands defined as JSON within the repository.
  • Choose LLM: Cody Pro users can select from a range of large language models, including Claude and ChatGPT variants, to customize their AI assistant.
  • Cody Natural Language Search (Beta): Utilizes natural language queries to quickly find and open relevant files within your codebase.
  • Code Search: Powered by Sourcegraph's code search, Cody extends its capabilities by retrieving context from entire projects, enabling accurate answers and code generation.

The Free plan is designed for light usage, offering 500 autocompletions and 20 messages and commands per month. It provides basic code context and personalization for small local codebases, with default LLM support across chat, commands, and autocomplete. Upgrading to the Pro plan, priced at $9 per user per month, is ideal for professional developers and small teams, offering unlimited autocompletions and messages, along with personalization tailored for larger local codebases. It also offers multiple LLM choices for chat and default LLMs for autocomplete. The Enterprise plan, at $19 per user per month, provides advanced personalization for enterprise codebases, flexible LLM choices, and additional enterprise features like flexible deployment options and admin security controls.


Blackbox

Blackbox AI is a comprehensive AI coding assistant with code suggestions, code generation, and code chat. Its Visual Studio Code extension further offers AI-generated commit messages and tracks local project changes. It focuses on code generation from natural language descriptions and includes a search function.

Visual Studio Code Extension for Blackbox


What’s possible with Blackbox:

  • Code Chat: Enables conversational coding queries with Blackbox AI Chat, providing instant answers to coding questions and enhancing collaboration.
  • Code Autocomplete: Boosts coding speed with Blackbox’s code autocomplete feature, available across 20+ programming languages, including Python, JavaScript, TypeScript, Go, and Ruby.
  • AI Commit: Simplifies the commit process with AI-generated commit messages, ensuring concise and informative descriptions for project changes.
  • Generate Code: Instantly generates code suggestions from a text prompt, enhancing productivity and reducing coding time.
  • DIFF View: Tracks project changes locally with BLACKBOX DIFF, leveraging git version control to monitor project evolution and enhance debugging and code review processes.
  • Code Search: Fetches relevant code context from across the entire codebase to provide accurate search results and find specific code snippets or solutions to coding problems.

Blackbox AI offers an onboarding call feature, potentially providing personalized support and guidance to users, as well as a monthly $9 Pro+ subscription for unlimited autocompletion and code chat.


Tabnine

Tabnine stands out as an AI coding assistant that was trained on high-quality public code repositories and offers both cloud-based and on-premises solutions. It utilizes proprietary LLMs that can be tailored to organisations, and supports various IDEs as well as a wide range of languages and frameworks.

Visual Studio Code Extension for Tabnine


What’s possible with Tabnine:

  • AI Code Generation: Excels at generating high-quality code snippets and autocompleting single lines or entire functions, with support for multiple programming languages.
  • AI Chat: Offers AI-powered chat functionality that supports various stages of the software development lifecycle, including code explanations, test generation, documentation, and bug fixes.
  • Personalization: Offers personalized coding assistance and swappable models tailored to coding style and patterns. It learns from the codebase and integrates into the development environment, providing context-aware suggestions that improve over time.
  • Privacy and Security: Ensures complete code privacy with zero data retention. This is crucial for enterprises and developers concerned about the security of their proprietary code.
  • Performance: The AI engine is designed to deliver real-time code suggestions without any noticeable lag, ensuring that developers can maintain a smooth and uninterrupted workflow.
  • Flexible Deployment: Depending on the subscription, Tabnine offers on-device or cloud-based AI assistance.

Tabnine offers three pricing plans to suit different user needs. The Basic plan is free and includes an on-device AI code completion model suitable for simple projects. The Pro plan, at $12 per user per month with a 90-day free trial, offers advanced features such as best-in-class AI models, full-function code completions, natural language code generation, and SaaS deployment. The Enterprise plan, priced at $39 per user per month with a 1-year commitment, includes all Pro features plus enhanced personalization, private code repository models, flexible deployment options, priority support, and comprehensive team management and reporting tools.

Llama Coder and Ollama

Llama Coder offers a robust, privacy-focused alternative to traditional cloud-based AI coding assistants like GitHub Copilot. Designed to run on-premises, this tool provides powerful AI capabilities directly on your machine, ensuring enhanced data privacy and, depending on the hardware, low-latency performance.


Ollama

Ollama is a platform that allows developers to run and experiment with advanced AI models, such as Codellama and Stable Code, locally on their own hardware. This approach leverages the power of local computation to provide fast and secure AI-assisted coding.
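To make this concrete, Ollama models can also be customized locally through a Modelfile; the sketch below is illustrative (the base model and parameter values are examples, not recommendations). The customized model is then built with `ollama create my-coder -f Modelfile` and started with `ollama run my-coder`:

```
# Modelfile: derive a customized coding model from a locally pulled base model
FROM codellama

# Lower temperature for more deterministic code suggestions
PARAMETER temperature 0.2

# System prompt applied to every conversation with this model
SYSTEM You are a concise coding assistant that answers with runnable code.
```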

What’s possible with Ollama:

  • Local AI Deployment: All AI computations are performed on your local machine, ensuring data privacy and eliminating the need for cloud-based processing. LLMs can be fine-tuned and customized to align with specific needs.
  • Multiple AI Models: Supports various models, including different parameter sizes and quantization levels, allowing management and customization based on hardware capabilities.
  • API: Ollama exposes a local API, allowing you to integrate LLMs directly into applications and workflows.
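As a quick illustration of that local API, a completion request can be sketched in plain Python. This assumes an Ollama server running on its default port 11434 and an already pulled model such as codellama; the helper names are our own:

```python
import json
import urllib.request

def build_request(prompt, model="codellama"):
    # Ollama's generate endpoint accepts a JSON body; stream=False
    # returns the whole completion in a single response object.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def complete(prompt, model="codellama"):
    # Send the request to the local server and extract the generated text.
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(complete("Write a Python function that reverses a string."))
```

Because everything runs against localhost, no code or prompts ever leave the machine.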

Llama Coder Extension

Llama Coder is an extension for Visual Studio Code that integrates with Ollama to offer self-hosted AI coding assistance. It provides intelligent code completions and suggestions, enhancing productivity and coding efficiency without compromising privacy.

Visual Studio Code Extension for Llama Coder


What’s possible with Llama Coder:

  • Code Autocomplete: Delivers real-time, accurate code completions.
  • Multi-Language Support: Compatible with various programming languages, making it versatile for different development environments.
  • Privacy-Focused: No telemetry or data tracking, ensuring that your coding activity remains private.
  • Model Flexibility: Lets you choose any available coding-tuned model from Ollama. The model can be installed locally or served from a remote endpoint.

Llama Coder and Ollama are entirely free to use, with no costs for accessing their features. The only limitation is the hardware requirement: they perform best on Apple Silicon Macs (M1/M2/M3) or systems equipped with an RTX 4090 GPU. Users must ensure they have adequate RAM, with a minimum of 16GB recommended, to leverage Llama Coder's capabilities effectively. This makes Llama Coder a cost-effective solution for developers who already possess or are willing to invest in suitable hardware.

A Small Comparison

In this test scenario, we're evaluating the coding assistants' performance and support in developing a LangChain Retrieval-Augmented Generation (RAG) application using Ollama LLMs in Python. Our focus is on how these assistants aid in coding tasks, particularly integrating the LLM into the LangChain framework.
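To make the scenario concrete, the core of such a RAG application is straightforward: retrieve relevant text chunks, stuff them into a prompt, and send the prompt to the model. A framework-agnostic sketch of the prompt-assembly step is shown below; the function names are illustrative, not part of LangChain's API:

```python
def build_rag_prompt(question, retrieved_chunks):
    # The retriever (e.g. a LangChain vector store) supplies text chunks;
    # they are joined into a single context block for the model.
    context = "\n\n".join(retrieved_chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )

if __name__ == "__main__":
    chunks = ["Ollama serves local LLMs on port 11434."]
    # The resulting string would be passed to an Ollama LLM via LangChain.
    print(build_rag_prompt("What port does Ollama use?", chunks))
```

In the actual test application, LangChain wires this pattern together from a retriever, a prompt template, and an Ollama LLM; the assistants were judged on how well they produced exactly this kind of glue code.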

GitHub Copilot started strong by auto-completing the entire function right after I wrote the framework and function definition. Most suggestions appeared quickly and were concise. The inline prompt chat can even explain, fix, or document code chunks. However, due to a lack of knowledge about the specific frameworks I was using, the assistant often neglected important details such as required parameters and framework-specific configurations. This sometimes led to incomplete or incorrect implementations, requiring manual adjustments to get the code working correctly.

Codeium also reacted quickly to my typing, providing accurate suggestions. Because Codeium can access my local repository, it already knew important aspects of the LangChain framework and generated a running RAG application in no time. This ability to leverage context from existing files in the repository significantly improved the relevance and accuracy of the suggestions. However, even without similar existing files, Codeium quickly found a solution, though it encountered some hiccups. There were occasional issues with less common use cases and specific configurations, but overall, Codeium proved to be a reliable assistant, capable of quickly generating functional code with minimal intervention.

With Cody, chat responses were quick and efficient. After providing a rough description of the application through the chat interface, the assistant swiftly implemented the necessary code and even added a small input feature, enabling dynamic questions. Thanks to Cody's ability to scan my codebase, it found a solution right away. However, without this additional context, Cody wouldn't respond as quickly and struggled to find the optimal solution, even after providing various “hints”.

Blackbox took a different approach compared to other assistants. Initially, it attempted to create an entirely different application and struggled to complete its code generation. Its process was notably slower and had less user interaction compared to other assistants, particularly with code completion, and it frequently failed to finish generating code.

Tabnine proved to be highly useful in crafting the application, although it primarily completed smaller sections of code due to the limitations of the free plan. Notably, Tabnine stands out as the only assistant to offer a more sophisticated user interface, allowing users to switch between models and adjust further settings. Despite being on the basic version, I had the flexibility to adjust deployment modes. However, there were instances where the tool failed to respond, particularly when used in local mode.

The Llama Coder Extension was straightforward to set up. Within a menu, I could select a specific model, though it was necessary to ensure that the model was installed locally with Ollama. The assistant itself needed some time to generate suggestions and sometimes produced meaningful and concise output, while at other times, it failed to grasp the context of the surrounding code altogether.

In the end, this comparison doesn’t represent the true effectiveness of the tools, as it always depends on coding style and performance for a specific language, as well as the selected model. However, it provides an opportunity to share our experiences with these artificial pair programmers, offering insights into their strengths and weaknesses in different scenarios.


Conclusion

To wrap up our exploration of coding assistants, it's crucial to recognize that there is a multitude of other tools out there, each with its own distinct features and innovative concepts. The tools we've discussed offer a glimpse into the diversity of options available to developers, showcasing advanced capabilities such as contextual understanding, custom commands, privacy-focused deployment, and more. However, the landscape of coding assistance is constantly shifting, with new tools emerging and existing ones evolving to meet the ever-changing demands of developers.
