
Ollama AI on WSL runs just as well as native on Windows 11


If you’re looking to use Ollama to run local LLMs (Large Language Models) on a Windows 11 PC, you have two options. The first is to use the Windows app and run it natively. The second is to run the Linux version through WSL.

The first is definitely easier. For one, you don’t need to have WSL installed, and the actual process of getting up and running is simpler. You just download the Windows installer, run it, and you’re up and running.
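For the WSL route, the setup amounts to a few commands. This is a minimal sketch: it assumes a Ubuntu WSL distro and uses Ollama’s official Linux install script; the model name (`llama3`) is just an example.

```shell
# From Windows (PowerShell, as admin): install WSL with the default Ubuntu distro
wsl --install

# Inside the WSL shell: install Ollama via its official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model (llama3 is an example; pick any model from the Ollama library)
ollama run llama3
```

Either way, once Ollama is running it listens on the same local API port (11434 by default), so client tools don’t care which route you chose.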
