Run Ollama on Windows - Step By Step installation of WSL2 and Ollama

Systematic Shenanigans II: The Return of the Shenanigans
Ollama lets us run multiple LLMs locally. This is useful when working with multi-agent frameworks like AutoGen, TaskWeaver, or crewAI on Windows. To run Ollama on Windows, we can enable the Windows Subsystem for Linux (WSL2) feature and install the Linux version of Ollama.

Chapters
01:01 - Turn on Virtual Machine Platform and Windows Subsystem for Linux
01:15 - How to install Linux on Windows with WSL2
01:54 - Update WSL1 to WSL2
02:44 - Download and install Ollama for Linux
03:28 - ollama run mistral on port 11434
04:00 - Run Ollama commands like /? for help and /bye to quit
04:27 - ollama run openhermes
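The steps in the chapters above can be sketched as the following commands. This is a minimal outline, assuming an Ubuntu distribution and default ports; the Windows feature names and reboot requirements may vary by Windows version.

```shell
# --- In an elevated PowerShell on Windows (reboot after enabling features) ---
# Turn on Virtual Machine Platform and Windows Subsystem for Linux
dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart
dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart

# Install a Linux distribution (Ubuntu by default) and make WSL2 the default
wsl --install
wsl --set-default-version 2

# If an existing distro is still on WSL1, convert it (name assumed to be "Ubuntu";
# check with `wsl -l -v`)
wsl --set-version Ubuntu 2

# --- Inside the Linux shell ---
# Download and install Ollama for Linux using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull and run a model; the Ollama server listens on localhost:11434
ollama run mistral

# Inside the interactive prompt: /? shows help, /bye quits
# Other models work the same way:
ollama run openhermes
```

Once the server is running, other tools on the Windows side (such as AutoGen or crewAI) can reach it at http://localhost:11434.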
