Prompt LLMs directly from your code editor using Ollama integrations

Self-Hosting LLMs using Ollama

Ollama provides an interface for self-hosting and interacting with open-source LLMs (Large Language Models), distributed as a binary or a container image. Managing LLMs with Ollama resembles managing container lifecycles with engines like Docker or Podman: the pull and run commands download and execute LLMs, respectively, and tags such as 13b-python and 7b-code distinguish variations of a model....
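The container-engine parallel above can be sketched with a few commands; this is a minimal illustration assuming a local Ollama install, and the model name codellama with its tags is used as an example:

```shell
# Download a model (analogous to `docker pull`); the tag after the
# colon selects a variation, e.g. a 7B-parameter code-tuned build.
ollama pull codellama:7b-code

# Run the model interactively (analogous to `docker run`); this
# downloads the model first if it is not already present locally.
ollama run codellama:7b-code

# List downloaded models, similar to `docker images`.
ollama list
```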

January 12, 2024 · 7 min · Avnish
My Homelab

Building Your Own Homelab

There is an app for everything, and modern app stores have made it extremely convenient to install them on your device. However, some underlying issues need to be discussed. Each application has a different set of terms of service; only a small subset of users reads them, and an even smaller subset refuses to use the application if they disagree. These applications may be critical to your daily life, but if you lose access to them, there might not be a proper support channel to regain access or retrieve your data....

March 27, 2023 · 12 min · Avnish