Documentation
Overview
Package model defines the interfaces and data structures for interacting with LLMs.
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type LLM
type LLM interface {
	Name() string
	GenerateContent(ctx context.Context, req *LLMRequest, stream bool) iter.Seq2[*LLMResponse, error]
}
LLM provides access to the underlying LLM.
type LLMRequest
type LLMRequest struct {
	Model    string
	Contents []*genai.Content
	Config   *genai.GenerateContentConfig
	Tools    map[string]any `json:"-"`
}
LLMRequest is the raw LLM request.
type LLMResponse
type LLMResponse struct {
	Content           *genai.Content
	CitationMetadata  *genai.CitationMetadata
	GroundingMetadata *genai.GroundingMetadata
	UsageMetadata     *genai.GenerateContentResponseUsageMetadata
	CustomMetadata    map[string]any
	LogprobsResult    *genai.LogprobsResult
	// Partial indicates whether the content is part of an unfinished content stream.
	// Only used in streaming mode and when the content is plain text.
	Partial bool
	// TurnComplete indicates whether the response from the model is complete.
	// Only used in streaming mode.
	TurnComplete bool
	// Interrupted indicates that the LLM was interrupted while generating the content,
	// usually due to a user interruption during bidirectional streaming.
	Interrupted bool
	ErrorCode    string
	ErrorMessage string
	FinishReason genai.FinishReason
	AvgLogprobs  float64
}
LLMResponse is the raw LLM response. It provides the first candidate response from the model if available.
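A streaming consumer typically branches on the flag fields above. The helper below is hypothetical (not part of the package) and uses a stand-in struct that keeps only the streaming-relevant fields; it sketches one plausible way to classify each streamed response, with error fields checked before the streaming flags:

```go
package main

import "fmt"

// responseStub keeps only the LLMResponse fields relevant to streaming;
// the real type also carries genai content and metadata.
type responseStub struct {
	Text         string
	Partial      bool
	TurnComplete bool
	Interrupted  bool
	ErrorCode    string
	ErrorMessage string
}

// classify is a hypothetical helper showing how a caller might
// branch on the streaming flags documented above.
func classify(r *responseStub) string {
	switch {
	case r.ErrorCode != "":
		return "error: " + r.ErrorCode
	case r.Interrupted:
		return "interrupted"
	case r.Partial:
		return "partial chunk"
	case r.TurnComplete:
		return "turn complete"
	default:
		return "complete content"
	}
}

func main() {
	fmt.Println(classify(&responseStub{Partial: true, Text: "Hel"}))
	fmt.Println(classify(&responseStub{TurnComplete: true}))
	fmt.Println(classify(&responseStub{ErrorCode: "RESOURCE_EXHAUSTED"}))
}
```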