🦙 Ollama Haskell

ollama-haskell is an unofficial Haskell client for Ollama, inspired by ollama-python. It enables interaction with locally running LLMs through the Ollama HTTP API, directly from Haskell.


✨ Features

  • 💬 Chat with models
  • ✍️ Text generation (with streaming)
  • ✅ Chat with structured messages and tools
  • 🧠 Embeddings
  • 🧰 Model management (list, pull, push, show, delete)
  • 🗃️ In-memory conversation history
  • ⚙️ Configurable timeouts, retries, streaming handlers

⚡ Quick Example

{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.Ollama.Generate
import qualified Data.Text.IO as T

main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "What is the meaning of life?"
          }
  eRes <- generate ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> do
      putStr "LLM response: "
      T.putStrLn (genResponse r)
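
The same call can stream partial results as they arrive instead of waiting for the full response. The sketch below is hedged: it assumes GenerateOps carries a stream field that accepts a per-chunk handler receiving each partial GenerateResponse; the exact shape of this field has varied across versions, so check the Haddock documentation for the release you are using.

```haskell
{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.Ollama.Generate
import qualified Data.Text.IO as T
import System.IO (hFlush, stdout)

main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "Tell me a short story."
            -- Assumed field: a callback invoked on every streamed chunk.
            -- Each chunk's genResponse holds only the newly generated text.
          , stream = Just (\resp -> T.putStr (genResponse resp) >> hFlush stdout)
          }
  eRes <- generate ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right _ -> putStrLn "\nDone."
```

Flushing stdout after each chunk matters here, since line buffering would otherwise hold back output until a newline appears.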

📦 Installation

Add to your .cabal file:

build-depends:
  base >=4.7 && <5,
  ollama-haskell

Or use with stack/nix-shell.
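
With Stack, if the version you want is not in your resolver's snapshot, you may need to list it as an extra dependency in stack.yaml. A fragment sketch (the version number shown is illustrative, not a recommendation):

```yaml
# stack.yaml (fragment) -- version shown is illustrative
extra-deps:
  - ollama-haskell-0.2.0.0
```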


📚 More Examples

See examples/OllamaExamples.hs for:

  • Chat with conversation memory
  • Structured JSON output
  • Embeddings
  • Tool/function calling
  • Multimodal input
  • Streaming and non-streaming variants
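
As a taste of the chat side, a minimal call might look like the following. This is a hedged sketch: the field and helper names (chatModelName, messages, userMessage) are assumed by analogy with the generate API and the changelog's convenience functions, so verify them against the package's Haddock documentation.

```haskell
{-# LANGUAGE OverloadedStrings #-}
module Main where

-- NOTE: sketch only; field and helper names (chatModelName, messages,
-- userMessage) are assumptions, not verified against a specific release.
import Data.List.NonEmpty (NonEmpty (..))
import Data.Ollama.Chat

main :: IO ()
main = do
  let ops =
        defaultChatOps
          { chatModelName = "gemma3"
            -- messages is assumed to be a NonEmpty list of Message values
          , messages = userMessage "Why is the sky blue?" :| []
          }
  eRes <- chat ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> print r
```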

🛠 Prerequisite

Make sure you have Ollama installed and running locally. Run ollama pull llama3 to download a model.


🧪 Dev & Nix Support

Use Nix:

nix-shell

This will install stack and Ollama.


👨‍💻 Author

Created and maintained by @tusharad. PRs and feedback are welcome!


🤝 Contributing

Have ideas or improvements? Feel free to open an issue or submit a PR!

Changes

Revision history for ollama-haskell

Unreleased

0.2.0.0 – 2025-06-05

  • Added a Stack build matrix to ensure the library is buildable from lts-19.33.
  • Made parameters & template fields optional in ShowModelResponse.
  • Added extra parameter fields in ModelInfo.
  • Added strict annotations for all fields.
  • Fixed the ToJSON instance for the delete model request body.
  • Removed duplicate code by using a unified withOllamaRequest function for all API calls.
  • Added a unified config type OllamaConfig to hold common configuration options.
  • Added validation for the generate and chat functions to ensure required fields are present.
  • Added convenience functions for generating Message and ToolCall types.
  • Added a thinking field for the chat and generate functions.
  • Added a ModelOptions type to encapsulate model options.
  • Added a function to get the Ollama version.
  • Added a common Manager, callback functions, and a retry option in OllamaConfig.
  • Fixed tool_calls.
  • Added MonadIO versions of API functions.
  • Added more comprehensive error handling for API calls.
  • Added more comprehensive test cases for all functions.
  • Added a schema builder for passing a JSON format for structured output.

0.1.3.0 – 2025-03-25

  • Added options, tools, and tool_calls fields in chat and generate.
  • Exported EmbeddingResponse.
  • Added a Format argument in the chat and generate functions for structured output.

0.1.2.0 – 2024-11-20

  • Added hostUrl and responseTimeOut options in the generate function.
  • Added hostUrl and responseTimeOut options in the chat function.

0.1.1.3 – 2024-11-08

  • Increased the response timeout to 15 minutes.
  • Added an encodeImage utility function that converts an image file path to base64 image data.
  • Added generateJson and chatJson, high-level functions that return the response as a Haskell type.

0.1.0.3 – 2024-11-05

  • Moved to Stack instead of Cabal.

0.1.0.2 – 2024-10-18

  • Increased the response timeout for the chat function.

0.1.0.1 – 2024-10-18

  • Renamed Lib.hs to OllamaExamples.hs as it conflicted with the Lib.hs name.

0.1.0.0 – YYYY-mm-dd

  • First version. Released on an unsuspecting world.