Class: Ollama::Commands::Chat

Inherits:
Object
Includes:
DTO
Defined in:
lib/ollama/commands/chat.rb

Overview

A command class that represents the chat API endpoint for Ollama.

This class is used to interact with the Ollama API’s chat endpoint, which generates conversational responses using a specified model. It follows the gem’s common command structure and provides the functionality needed to execute chat requests for interactive conversations with language models.

Examples:

Initiating a chat conversation

messages = [
  Ollama::Message.new(role: 'user', content: 'Hello, how are you?'),
  Ollama::Message.new(role: 'assistant', content: 'I am doing well, thank you!')
]
chat = ollama.chat(model: 'llama3.1', stream: true, messages:)
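
Streaming the reply with a block (a hedged sketch: whether the client yields each streamed response to a plain block, and whether each response exposes message.content, are assumptions not confirmed by this page)

ollama.chat(model: 'llama3.1', stream: true, messages:) do |response|
  print response.message&.content # print each streamed chunk as it arrives
end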

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Methods included from DTO

#==, #as_array, #as_array_of_hashes, #as_hash, #as_json, #empty?, #to_json

Constructor Details

#initialize(model:, messages:, tools: nil, format: nil, options: nil, stream: nil, keep_alive: nil, think: nil) ⇒ Chat

The initialize method sets up a new Chat command instance with its request parameters.

This method is responsible for initializing a new object instance and configuring it with parameters required for chat interactions. It sets up the model, conversation messages, tools, format, options, streaming behavior, keep-alive duration, and thinking mode.

Parameters:

  • model (String)

    the name of the model to use for chat responses

  • messages (Array<Ollama::Message>, Hash, nil)

    conversation history with roles and content

  • tools (Array<Ollama::Tool>, Hash, nil) (defaults to: nil)

    tools available for function calling

  • format (String, nil) (defaults to: nil)

    response format (e.g., ‘json’)

  • options (Ollama::Options, nil) (defaults to: nil)

    configuration parameters for the model

  • stream (TrueClass, FalseClass, nil) (defaults to: nil)

    whether to enable streaming for the operation

  • keep_alive (String, nil) (defaults to: nil)

    duration to keep the model loaded in memory

  • think (Boolean, nil) (defaults to: nil)

    whether to enable thinking mode for reasoning



# File 'lib/ollama/commands/chat.rb', line 43

def initialize(model:, messages:, tools: nil, format: nil, options: nil, stream: nil, keep_alive: nil, think: nil)
  @model, @messages, @tools, @format, @options, @stream, @keep_alive, @think =
    model, as_array_of_hashes(messages), as_array_of_hashes(tools),
    format, options, stream, keep_alive, think
end
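
Below is a minimal construction sketch for using the command class directly. It assumes Ollama::Options accepts model parameters such as temperature as keyword arguments; the exact option names are assumptions, not taken from this page.

chat = Ollama::Commands::Chat.new(
  model:    'llama3.1',
  messages: [ Ollama::Message.new(role: 'user', content: 'Reply in JSON.') ],
  format:   'json',                               # request a JSON-formatted response
  options:  Ollama::Options.new(temperature: 0),  # assumption: temperature is a valid option key
  stream:   false
)
chat.to_json # request body serialized via the included DTO module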

Instance Attribute Details

#client=(value) ⇒ Object (writeonly)

The client attribute writer allows setting the client instance associated with the object.

This method assigns the client that will be used to perform requests and handle responses for this command. It is typically called internally when a command is executed through a client instance.



# File 'lib/ollama/commands/chat.rb', line 99

def client=(value)
  @client = value
end
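
This wiring normally happens inside the client when a command is executed, so user code rarely calls the writer directly. A hedged sketch of the equivalent manual steps (the handler class name is an assumption):

chat = Ollama::Commands::Chat.new(model: 'llama3.1', messages:, stream: true)
chat.client = ollama                       # ollama is the Ollama::Client instance
chat.perform(Ollama::Handlers::Print.new)  # assumption: Print is one of the bundled handlers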

#format ⇒ String? (readonly)

The format attribute reader returns the response format associated with the object.

Returns:

  • (String, nil)

    response format (e.g., ‘json’)



# File 'lib/ollama/commands/chat.rb', line 67

def format
  @format
end

#keep_alive ⇒ String? (readonly)

The keep_alive attribute reader returns the keep-alive duration associated with the object.

Returns:

  • (String, nil)

    duration to keep the model loaded in memory



# File 'lib/ollama/commands/chat.rb', line 84

def keep_alive
  @keep_alive
end

#messages ⇒ Array<Ollama::Message>? (readonly)

The messages attribute reader returns the conversation history associated with the object.

Returns:

  • (Array<Ollama::Message>, nil)

    conversation history with roles and content



# File 'lib/ollama/commands/chat.rb', line 57

def messages
  @messages
end

#model ⇒ String (readonly)

The model attribute reader returns the model name associated with the object.

Returns:

  • (String)

    the name of the model to use for chat responses



# File 'lib/ollama/commands/chat.rb', line 52

def model
  @model
end

#options ⇒ Ollama::Options? (readonly)

The options attribute reader returns the model configuration parameters associated with the object.

Returns:

  • (Ollama::Options, nil)

    configuration parameters for the model

# File 'lib/ollama/commands/chat.rb', line 72

def options
  @options
end

#stream ⇒ TrueClass, FalseClass, nil (readonly)

The stream attribute reader returns the streaming behavior setting associated with the object.

Returns:

  • (TrueClass, FalseClass, nil)

    the streaming behavior flag, indicating whether streaming is enabled for the command execution (nil by default)



# File 'lib/ollama/commands/chat.rb', line 79

def stream
  @stream
end

#think ⇒ Boolean? (readonly)

The think attribute reader returns the thinking mode setting associated with the object.

Returns:

  • (Boolean, nil)

    whether thinking mode is enabled for reasoning



# File 'lib/ollama/commands/chat.rb', line 89

def think
  @think
end

#tools ⇒ Array<Ollama::Tool>? (readonly)

The tools attribute reader returns the available tools associated with the object.

Returns:

  • (Array<Ollama::Tool>, nil)

    tools available for function calling



# File 'lib/ollama/commands/chat.rb', line 62

def tools
  @tools
end

Class Method Details

.path ⇒ String

The path method returns the API endpoint path for chat requests.

This class method provides the specific URL path used to interact with the Ollama API’s chat endpoint. It is utilized internally by the command structure to determine the correct API route for conversational interactions.

Returns:

  • (String)

    the API endpoint path ‘/api/chat’ for chat requests



# File 'lib/ollama/commands/chat.rb', line 24

def self.path
  '/api/chat'
end

Instance Method Details

#perform(handler) ⇒ self

The perform method executes a command request using the specified handler.

This method initiates a POST request to the Ollama API’s chat endpoint, utilizing the client instance to send the request and process responses through the provided handler. It handles both streaming and non-streaming scenarios based on the command’s configuration.


Parameters:

  • handler (Ollama::Handler)

    the handler object responsible for processing API responses

Returns:

  • (self)

    returns the current instance after initiating the request



# File 'lib/ollama/commands/chat.rb', line 113

def perform(handler)
  @client.request(method: :post, path: self.class.path, body: to_json, stream:, handler:)
end
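
A minimal sketch of a custom handler, assuming the only contract is an object that responds to #call(response); that assumption is not confirmed by this page.

class PrintContent
  # Assumption: each streamed response carries a partial assistant message
  # under response.message.content.
  def call(response)
    print response.message&.content
  end
end

chat.perform(PrintContent.new)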