Ollama

Learn the essential Ollama commands for running models such as SmolLM2 and Qwen 2.5 on your local machine
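
To give a flavour of those commands, here is a minimal sketch, assuming Ollama is already installed and that the `smollm2` and `qwen2.5` tags are available from the Ollama library:

```bash
# Download the models from the Ollama registry
ollama pull smollm2
ollama pull qwen2.5

# List the models stored locally
ollama list

# Start an interactive chat session with a model
ollama run smollm2

# Run a one-off prompt without entering the interactive session
ollama run qwen2.5 "Explain what a context window is in one sentence."

# Show which models are currently loaded in memory, then delete one
ollama ps
ollama rm smollm2
```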

Learn what Ollama is, its features, and how to run it on your local machine with the DeepSeek R1 and SmolLM2 models
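
As a complement to the snippet above, a short sketch of inspecting and chatting with DeepSeek R1; the `deepseek-r1:1.5b` tag is an assumption here, pick whichever size fits your hardware:

```bash
# Print the model's metadata (parameters, template, license)
ollama show deepseek-r1:1.5b

# Chat with the model interactively
ollama run deepseek-r1:1.5b
```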

Learn how to use the Ollama APIs such as generate and chat, plus others like list models and pull model, with cURL and jq, along with useful examples
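
As a rough sketch of what that looks like in practice, assuming Ollama is listening on its default port 11434 and `smollm2` has already been pulled:

```bash
# Non-streaming generate request; jq extracts just the generated text
curl -s http://localhost:11434/api/generate \
  -d '{"model": "smollm2", "prompt": "Why is the sky blue?", "stream": false}' \
  | jq -r '.response'

# List the locally available models and print only their names
curl -s http://localhost:11434/api/tags | jq -r '.models[].name'
```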

Learn how to use Ollama and Open WebUI inside Docker with Docker Compose to run any open LLM and build your own mini ChatGPT.
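
That article uses Docker Compose; as a rough equivalent sketch with plain `docker run` (the ports, volume names, and the Open WebUI image tag here are assumptions based on its public defaults):

```bash
# Ollama server container, exposing its API on the default port 11434
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Open WebUI container, pointed at the Ollama API and served on http://localhost:3000
docker run -d --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```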