Below you will find pages that utilize the taxonomy term “Ai”
Running Hugging Face models on Ollama
As part of my continuing experiments with AI, I regularly use models from Hugging Face. Ollama makes it really easy to run models locally from its own repo, but sometimes it does not have exactly the model one wants.
Why Use Hugging Face
While Ollama has its own curated repository of models you can use, Hugging Face has kind of become the community default. The choice is much larger and opens up possibilities that are not available if you just stick with Ollama’s own repository. For mainstream models, there may be quantizations that are not available in Ollama’s own repo. There are also plenty of fine-tunes of more mainstream models that are only on Hugging Face.
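Ollama can pull GGUF models straight from Hugging Face by prefixing the repository path with `hf.co`. A minimal sketch, where the repository and quantization tag below are just examples and your model of choice will differ:

```shell
# Pull and run a GGUF model directly from a Hugging Face repo.
# Format: ollama run hf.co/{username}/{repository}:{quantization}
# The repo must contain GGUF files; the tag selects a quantization.
ollama run hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q4_K_M
```

Once pulled, the model shows up in `ollama list` like any other and can be referenced by the same `hf.co/...` name from tools that talk to Ollama.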
n8n with Ollama on Kubernetes
Once again I have a new tool I have been playing with, and once again it is AI related. One of the tools I have been using is n8n, a workflow automation platform. It enables us to integrate multiple applications and services through a visual interface. While it is very much an enterprise solution, it is FLOSS and we can deploy it at home, albeit with some caveats and/or workarounds.
One of the really powerful parts of n8n is that it can integrate with various AI platforms, including all the usual suspects: Claude, ChatGPT, etc. Of course I want to keep things local, which n8n caters for with its Ollama integration.
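Before wiring up the n8n credential, it is worth confirming that the Ollama API is reachable from inside the cluster. A quick sketch, assuming Ollama is exposed as a Service named `ollama` in the `ollama` namespace and already has a model pulled (both names here are assumptions, adjust for your setup):

```shell
# From a pod in the cluster, hit Ollama's generate endpoint
# using the in-cluster DNS name of the Service.
curl http://ollama.ollama.svc.cluster.local:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Say hello", "stream": false}'
```

The same base URL (`http://ollama.ollama.svc.cluster.local:11434`) is what goes into the n8n Ollama credential, since `localhost` will not resolve to the Ollama pod from the n8n container.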
OpenClaw: My first experience with a self-hosted AI Assistant
Recently, I decided to jump on the latest AI bandwagon. While cloud-based large language models are powerful and convenient, I wanted complete control over my data, plus the satisfaction of running everything on my own infrastructure. Enter OpenClaw — a personal AI assistant that runs locally and integrates with messaging platforms you already use.