Writing Better Code with Ollama

Matt Williams
26 Jan 2024 · 04:43

TLDR: The video covers the recent release of an official Node.js library for Ollama, which can be found on GitHub. The speaker shares their experience with the library, highlighting how streaming responses make for a faster-feeling user experience. They also introduce Llama Coder, an AI-powered code assistant that autocompletes code and can write code from a descriptive comment, and Continue.dev, a tool that lets developers ask coding questions and receive model-based answers without an internet connection. The speaker recommends both Llama Coder and Continue.dev as free alternatives to Copilot, especially useful for those working offline. The video concludes with a brief setup guide, suggesting specific models for optimal performance and mentioning the option to disable telemetry for privacy.

Takeaways

  • πŸš€ The Ollama team has released an official Node.js library, which can be found on GitHub.
  • πŸ€– The speaker plans to start building with the Node.js library and its Python equivalent soon.
  • πŸ’¬ Ollama can be instantiated and used to call the chat endpoint with a system prompt and initial question.
  • ⏱ By default, the endpoint does not stream, providing a JSON response with performance metrics.
  • πŸ”„ Setting the stream to true changes the function to return an async generator instead of a JSON blob.
  • πŸ“ Llama Coder, an alternative to Copilot, can autocomplete code and even add comments based on user description.
  • πŸ“œ To extract specific content like tokens from JSON blobs, one can modify the console.log statement.
  • πŸ’¬ The speaker discusses the need for a conversation with the code and an expert to refine the process.
  • πŸ” Continue.dev is a tool that can help with coding questions, such as finding alternatives to console.log.
  • πŸ› οΈ Both Llama Coder and Continue are VS Code extensions that provide free assistance, even without an internet connection.
  • πŸŒ‰ The speaker highlights the utility of these tools for coding during a ferry ride without cell service.
  • πŸ“‹ To set up Ollama, one needs to install it, configure the Llama Coder extension, and optionally adjust settings like disabling telemetry.
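The chat call mentioned in the takeaways can be sketched roughly as follows. This assumes the official ollama npm package and a locally running Ollama server; the model name is only an example, and the network call itself is shown in comments rather than executed:

```javascript
// Sketch of a non-streaming chat request (model name is an example).
const request = {
  model: 'llama2',
  messages: [
    { role: 'system', content: 'You are a helpful assistant. Answer briefly.' },
    { role: 'user', content: 'Why is the sky blue?' },
  ],
};

// With the package installed and a server running, the call would look like:
//   import ollama from 'ollama';
//   const response = await ollama.chat(request);
//   console.log(response.message.content);
// By default the response arrives as one JSON blob, including performance metrics.

console.log(request.messages.map((m) => m.role).join(', ')); // system, user
```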

Q & A

  • What is the recent development by the Ollama team?

    -The Ollama team has released an official Node.js library, which can be found on GitHub.

  • What is the default behavior of the chat endpoint in the Ollama library?

    -The chat endpoint defaults to not streaming, meaning it returns a JSON blob with the output and performance metrics once the response is complete.

  • What happens if you set the stream to true in the Ollama library?

    -Setting the stream to true changes the function to return an async generator instead of a JSON blob, which allows you to start seeing content sooner.

  • Which tool is mentioned for code assistance that provides autocompletion and code generation?

    -Llama Coder is mentioned as a tool that provides autocompletion and writes code based on the developer's comments.

  • What is the alternative to console.log suggested for printing without adding a newline?

    -The alternative suggested is process.stdout.write, which allows printing without adding a newline.

  • What are the two VS Code extensions mentioned for an alternative to Copilot?

    -The two VS Code extensions mentioned are Llama Coder and Continue.

  • What is the advantage of using Llama Coder and Continue when there is no internet connection?

    -These extensions work offline, providing code assistance even when there is no internet connection, which is useful for developers in areas with limited connectivity.

  • What is the process to set up the Llama Coder extension in VS Code?

    -To set up Llama Coder, install the extension and then configure the model settings. The speaker uses the deepseek coder 1.3b q4 model but suggests experimenting with different models to find the best fit.

  • What is the role of Continue.dev in the development process?

    -Continue.dev is used for asking questions about code and receiving answers. It can also suggest alternatives to certain coding practices, like using process.stdout.write instead of console.log.

  • How can one disable telemetry in Continue to prevent it from using the internet?

    -You can disable telemetry by reviewing the Continue documentation and adjusting the appropriate settings to prevent any internet usage.

  • What are some other VS Code extensions that are mentioned as alternatives to Copilot?

    -Other VS Code extensions mentioned are Code GPT and Ollama Autocoder.

  • What is the speaker's location and how does it affect their internet connectivity?

    -The speaker lives on an island near Seattle and has to take a ferry to get to the mainland. There are no cell towers in the middle of Puget Sound, which means they often experience periods without internet connection.

Outlines

00:00

πŸš€ Introduction to Ollama Node.js Library

The speaker introduces a new Node.js library for Ollama, available on GitHub, and expresses their intention to start using it along with its Python counterpart. They demonstrate how to use the library by importing it, instantiating it, and calling the chat endpoint with a system prompt and an initial question. The speaker then discusses the default behavior of the endpoint, which is to return a JSON blob with the output and performance metrics. They also touch upon the benefits of enabling streaming for faster perceived response times and the need to handle the async generator that results from setting the stream to true.
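The streaming behavior described above can be sketched with a stand-in generator. With stream set to true, the library yields parts shaped roughly like { message: { content } }; fakeChatStream below is a hypothetical stub standing in for that, not the library itself:

```javascript
// Hypothetical stub that mimics the shape of Ollama's streamed chat parts.
async function* fakeChatStream() {
  for (const chunk of ['The sky ', 'is blue ', 'because of scattering.']) {
    yield { message: { content: chunk } };
  }
}

// Consuming the async generator: print each chunk as soon as it arrives,
// instead of waiting for one finished JSON blob.
let full = '';
for await (const part of fakeChatStream()) {
  process.stdout.write(part.message.content); // no newline between chunks
  full += part.message.content;
}
process.stdout.write('\n');
```

The for await...of loop is the "different handling" the speaker refers to: the caller iterates over parts rather than reading a single completed response.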

Keywords

Ollama

Ollama is a tool for running large language models locally, and it now has an official Node.js library. It is central to the video's theme of improving code writing: the script discusses instantiating Ollama and calling its chat endpoint, indicating its role in facilitating programming tasks.

Node.js

Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine, which allows for server-side JavaScript execution. In the script, it is mentioned as the platform for which the Ollama team has released an official library, signifying its relevance in the context of backend development and the use of Ollama.

GitHub

GitHub is a web-based platform for version control and collaboration that allows developers to work on projects from anywhere. The video script refers to finding the Ollama Node.js library on GitHub, highlighting its role as a repository for code and a community hub for developers.

Streaming

In the context of the video, streaming refers to the process of receiving data in a continuous flow rather than waiting for the entire response. The script discusses setting the stream to true for the Ollama function, which changes the output from a JSON blob to an async generator, affecting how the response is handled and perceived in terms of speed.

Async Generator

An async generator is a type of iterator in JavaScript that can asynchronously yield values. The script mentions that setting the stream to true in the Ollama function results in an async generator being returned, which requires a different handling method compared to a JSON blob.
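A minimal example of the pattern, unrelated to any particular library:

```javascript
// An async generator yields values one at a time; callers consume it
// with for await...of instead of reading one finished result.
async function* countdown(from) {
  for (let i = from; i > 0; i--) {
    yield i;
  }
}

const seen = [];
for await (const n of countdown(3)) {
  seen.push(n);
}
console.log(seen.join(', ')); // 3, 2, 1
```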

Llama Coder

Llama Coder is an extension for Visual Studio Code (VS Code) that assists with coding by autocompleting code snippets. The video script describes using Llama Coder to write code based on comments, indicating its utility in streamlining the coding process and enhancing productivity.

Continue.dev

Continue.dev is a VS Code extension that allows users to ask questions about their code and receive answers. The script describes using Continue.dev to find alternatives to console.log, demonstrating its role in aiding developers with real-time coding inquiries and solutions.

process.stdout.write

process.stdout.write is a method in Node.js used for writing output to the console without adding a newline character. The video script mentions using process.stdout.write as an alternative to console.log for printing tokens without newlines, showcasing its use in controlling console output formatting.
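The difference shows up when printing tokens one at a time:

```javascript
// console.log appends a newline after every call, so each token would land
// on its own line; process.stdout.write leaves the cursor where it is.
let line = '';
for (const token of ['Hello', ', ', 'world']) {
  process.stdout.write(token); // all three land on one line
  line += token;
}
process.stdout.write('\n'); // end the line explicitly
```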

VS Code Extensions

VS Code Extensions are add-ons that enhance the functionality of Visual Studio Code. The script discusses several extensions, including Llama Coder and Continue, that provide coding assistance and improve the development experience by offering features like autocompletion and real-time coding support.

Telemetry

Telemetry is the process of collecting and analyzing data that is often used for monitoring and improving software. In the context of the video, telemetry is mentioned in relation to disabling it in the Continue extension to prevent internet usage, which is relevant for users working offline.

Offline Coding

Offline coding refers to the ability to write and work on code without an internet connection. The video script highlights the importance of offline coding, especially for the speaker who lives on an island and occasionally has no internet access, emphasizing the value of tools that function well without a live internet connection.

Sourcegraph

Sourcegraph is a code search and intelligence tool that helps developers understand and navigate codebases. The video script mentions the speaker's interest in working with Cody from Sourcegraph, indicating the potential integration of Sourcegraph's functionalities with local coding tools for enhanced code understanding and navigation.

Highlights

The Ollama team has released an official Node.js library for Ollama, available on GitHub.

The speaker plans to start building with the Node.js library and its Python equivalent soon.

Importing and instantiating Ollama to call the chat endpoint with a system prompt and initial question.

The default chat endpoint does not stream, providing a JSON blob with output and performance metrics.

Enabling streaming can make the response feel faster by showing content sooner, but it requires a different handling approach.

Llama Coder, an alternative to Copilot, provides code suggestions and can write code based on comments.

Llama Coder can auto-complete code suggestions, improving the coding experience.

Continue.dev is a tool that allows users to ask questions about code and receive answers without needing an internet connection.

Using process.stdout.write as an alternative to console.log to print without adding a newline.

Both Llama Coder and Continue are free extensions for VS Code that can work offline.

The speaker lives on an island and appreciates the ability to code without an internet connection during ferry rides.

Instructions on setting up Ollama, Llama Coder, and Continue are provided for users.

The importance of choosing the right model in Llama Coder for speed and accuracy is discussed.

Continue allows users to select different models and has settings to disable telemetry for privacy.

Other VS Code extensions like Code GPT and Ollama Autocoder are mentioned, but Llama Coder and Continue are preferred by the speaker.

The speaker expresses interest in trying out Cody from Sourcegraph and discusses the enthusiasm of its CEO.

The speaker invites viewers to share their local setup or configuration preferences in the comments.

The speaker asks viewers for feedback on whether they have replaced Copilot with a local alternative.