Prompt LLMs directly from your code editor using Ollama integrations

Self Hosting LLMs using Ollama

Ollama provides an interface for self-hosting and interacting with open-source LLMs (Large Language Models) via its binary or container image. Managing LLMs with Ollama is much like managing container lifecycles with an engine such as Docker or Podman: the pull and run commands download and execute LLMs respectively, mirroring their container counterparts. Tags like 13b-python and 7b-code distinguish different variations of an LLM....
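
The container analogy above can be sketched with a few commands. This is a minimal illustration, assuming Ollama is installed locally; codellama is used as an example model matching the 7b-code tag mentioned:

```shell
# Download a model, analogous to `docker pull`
ollama pull codellama:7b-code

# Run the model and start an interactive prompt, analogous to `docker run`
ollama run codellama:7b-code

# List locally available models, analogous to `docker images`
ollama list
```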

January 12, 2024 · 7 min · Avnish
Screen Recording from my development workflow

My Development Environment: kitty, zsh, Neovim, tmux, and lazygit

Until now I’ve been using Visual Studio Code as my primary code editor, because I try different Linux distributions on my laptop and VSCode is available by default in almost every application manager. When I open a new project, VSCode suggests relevant extensions based on the tech stack. After hearing the praise for Neovim from Primeagen and TJ DeVries, I decided to give it a go along with other command-line utilities like tmux and lazygit to test whether they optimize my development workflow....

December 21, 2023 · 6 min · Avnish