# 🦙 Ollama Haskell

`ollama-haskell` is an unofficial Haskell client for [Ollama](https://ollama.com), inspired by [ollama-python](https://github.com/ollama/ollama-python). It enables interaction with locally running LLMs through the Ollama HTTP API, directly from Haskell.
## ✨ Features

- 💬 Chat with models
- ✍️ Text generation (with streaming)
- ✅ Chat with structured messages and tools
- 🧠 Embeddings
- 🧰 Model management (list, pull, push, show, delete)
- 🗂️ In-memory conversation history
- ⚙️ Configurable timeouts, retries, streaming handlers
## ⚡ Quick Example

```haskell
{-# LANGUAGE OverloadedStrings #-}

module Main where

import Data.Ollama.Generate
import qualified Data.Text.IO as T

main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "What is the meaning of life?"
          }
  eRes <- generate ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> do
      putStr "LLM response: "
      T.putStrLn (genResponse r)
```
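The same call shape works for chat. Below is a minimal sketch mirroring the example above; the chat-side names (`Data.Ollama.Chat`, `chat`, `defaultChatOps`, `chatModelName`, `messages`, `userMessage`) are assumptions modeled on the generate API rather than confirmed signatures, so check the Hackage docs for your version:

```haskell
{-# LANGUAGE OverloadedStrings #-}

module Main where

import Data.List.NonEmpty (NonEmpty (..))
import Data.Ollama.Chat -- assumed module name, mirroring Data.Ollama.Generate

main :: IO ()
main = do
  -- `messages` is assumed to take a non-empty list of chat messages;
  -- `userMessage` is assumed to build one with the User role.
  let ops =
        defaultChatOps
          { chatModelName = "gemma3"
          , messages = userMessage "Why is the sky blue?" :| []
          }
  eRes <- chat ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> print r -- inspect the reply; field names vary by version
```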
## 📦 Installation

Add to your `.cabal` file:

```cabal
build-depends:
    base >=4.7 && <5,
    ollama-haskell
```

Or use it with `stack`/`nix-shell`.
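If you build with `stack` and your resolver snapshot doesn't include the package, you may additionally need an `extra-deps` entry in `stack.yaml` (a sketch; pin to whatever version is current on Hackage):

```yaml
# stack.yaml (sketch): pull ollama-haskell from Hackage when it is not
# part of the resolver snapshot. Replace the version as needed.
extra-deps:
  - ollama-haskell-0.2.0.0
```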
## 📚 More Examples

See `examples/OllamaExamples.hs` for:
- Chat with conversation memory (see the sketch after this list)
- Structured JSON output
- Embeddings
- Tool/function calling
- Multimodal input
- Streaming and non-streaming variants
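To give a flavor of the conversation-memory pattern before you open the examples file: the usual approach is to resend the whole message history on every call, appending each new turn as you go. A minimal sketch, reusing the assumed chat API from the Quick Example section (`Message`, `OllamaError`, and `ChatResponse` are likewise assumed names; the examples file shows the real thing):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.List.NonEmpty (NonEmpty (..))
import Data.Ollama.Chat
import Data.Text (Text)

-- Ask one more question in an ongoing conversation: append it to the
-- history and resend the full message list, so the model sees every
-- prior turn. All chat-side names here are assumptions (see lead-in).
askWithHistory
  :: NonEmpty Message -- prior turns (system, user, assistant, ...)
  -> Text             -- the new user question
  -> IO (Either OllamaError ChatResponse)
askWithHistory history question =
  chat
    defaultChatOps
      { chatModelName = "gemma3"
      , messages = history <> (userMessage question :| [])
      }
    Nothing
```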
## 📋 Prerequisite

Make sure you have Ollama installed and running locally. Run `ollama pull llama3` to download a model.
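A typical first-time setup looks like this (the `serve` step is only needed when Ollama isn't already running as a background service):

```bash
# Start the Ollama server; by default it listens on http://localhost:11434.
ollama serve

# In another terminal, download a model for the client to talk to.
ollama pull llama3
```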
## 🧪 Dev & Nix Support

Use Nix:

```bash
nix-shell
```

This installs `stack` and Ollama.
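Inside the Nix shell, the standard `stack` workflow applies, for example:

```bash
stack build   # build the library
stack test    # run the test suite (if the package defines one)
```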
## 👨‍💻 Author
Created and maintained by @tusharad. PRs and feedback are welcome!
## 🤝 Contributing
Have ideas or improvements? Feel free to open an issue or submit a PR!