Running Hugging Face models on Ollama
As part of my continuing experiments with AI, I regularly use models from Hugging Face. Ollama makes it really easy to run models locally from its own repository, but sometimes it does not have exactly the model one wants.
Why Use Hugging Face
While Ollama has its own curated repository of models you can use, Hugging Face has become the community default. The choice is much larger and opens up possibilities that are not available if you just stick with Ollama's own repository. For mainstream models, there may be quantizations that are not available in Ollama's own repo, and there are plenty of fine-tunes of mainstream models that exist only on Hugging Face.
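As a concrete sketch, recent Ollama releases can pull GGUF models directly from Hugging Face using an `hf.co/{user}/{repo}` reference, with an optional tag to select a specific quantization. The repository and quantization below are illustrative examples, not a recommendation:

```shell
# Pull and run a GGUF model straight from Hugging Face
# (example repo; any GGUF repo on hf.co should work the same way).
# The :Q4_K_M suffix selects that quantization from the repo.
ollama run hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q4_K_M
```

Omitting the tag lets Ollama pick a default quantization from the repository.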
By Chris Cowley

