model

package
v0.3.0
Published: Dec 17, 2025 License: Apache-2.0 Imports: 3 Imported by: 3

Documentation

Overview

Package model defines the interfaces and data structures for interacting with LLMs.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type LLM

type LLM interface {
	Name() string
	GenerateContent(ctx context.Context, req *LLMRequest, stream bool) iter.Seq2[*LLMResponse, error]
}

LLM provides access to the underlying LLM.
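
As an illustration, here is a minimal sketch of a type that satisfies LLM. The echoLLM type and its behavior are hypothetical, not part of this package; the snippet assumes this model package is imported under the name model (its import path is omitted) and that the genai identifiers come from google.golang.org/genai.

import (
	"context"
	"iter"

	"google.golang.org/genai"
	// The model package documented here is also imported (path omitted).
)

// echoLLM is a hypothetical LLM that echoes the last request content back
// as a single, complete response.
type echoLLM struct{}

func (echoLLM) Name() string { return "echo" }

func (echoLLM) GenerateContent(ctx context.Context, req *model.LLMRequest, stream bool) iter.Seq2[*model.LLMResponse, error] {
	return func(yield func(*model.LLMResponse, error) bool) {
		var text string
		if n := len(req.Contents); n > 0 && len(req.Contents[n-1].Parts) > 0 {
			text = req.Contents[n-1].Parts[0].Text
		}
		yield(&model.LLMResponse{
			Content:      &genai.Content{Role: "model", Parts: []*genai.Part{{Text: text}}},
			TurnComplete: true,
		}, nil)
	}
}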

type LLMRequest

type LLMRequest struct {
	Model    string
	Contents []*genai.Content
	Config   *genai.GenerateContentConfig

	Tools map[string]any `json:"-"`
}

LLMRequest is the raw LLM request.
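
A sketch of assembling a request for a single user prompt is shown below. The model name and prompt text are placeholders, and genai is assumed to be google.golang.org/genai.

req := &model.LLMRequest{
	// Model names the underlying model to call; "gemini-2.0-flash" is only a placeholder.
	Model: "gemini-2.0-flash",
	Contents: []*genai.Content{
		{Role: "user", Parts: []*genai.Part{{Text: "Summarize the latest release notes."}}},
	},
	// Generation settings (temperature, max output tokens, ...) go in Config.
	Config: &genai.GenerateContentConfig{},
}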

type LLMResponse

type LLMResponse struct {
	Content           *genai.Content
	CitationMetadata  *genai.CitationMetadata
	GroundingMetadata *genai.GroundingMetadata
	UsageMetadata     *genai.GenerateContentResponseUsageMetadata
	CustomMetadata    map[string]any
	LogprobsResult    *genai.LogprobsResult
	// Partial indicates whether the content is part of an unfinished content stream.
	// Only used for streaming mode and when the content is plain text.
	Partial bool
	// Indicates whether the response from the model is complete.
	// Only used for streaming mode.
	TurnComplete bool
	// Interrupted indicates that the LLM was interrupted while generating the
	// content, usually due to user interruption during bidi streaming.
	Interrupted  bool
	ErrorCode    string
	ErrorMessage string
	FinishReason genai.FinishReason
	AvgLogprobs  float64
}

LLMResponse is the raw LLM response. It provides the first candidate response from the model if available.
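
A sketch of draining the iterator returned by GenerateContent in streaming mode follows; llm is any LLM implementation, req is a request as above, and ctx, fmt, and log are assumed to be in scope or imported.

for resp, err := range llm.GenerateContent(ctx, req, true) {
	if err != nil {
		log.Fatalf("generate content: %v", err)
	}
	if resp.Partial && resp.Content != nil && len(resp.Content.Parts) > 0 {
		// Partial chunks carry incremental plain text.
		fmt.Print(resp.Content.Parts[0].Text)
	}
	if resp.TurnComplete {
		// The model has finished its turn; stop reading.
		fmt.Println()
		break
	}
}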

Directories

Path	Synopsis
gemini	Package gemini implements the model.LLM interface for Gemini models.
