Class: Ollama::Commands::Chat

Inherits: Object
Includes: DTO
Defined in: lib/ollama/commands/chat.rb

Overview

A command class that represents the chat API endpoint for Ollama.

This class is used to interact with the Ollama API’s chat endpoint, which generates conversational responses using a specified model. It inherits from the base command structure and provides the necessary functionality to execute chat requests for interactive conversations with language models.

Examples:

Initiating a chat conversation

messages = [
  Ollama::Message.new(role: 'user', content: 'Hello, how are you?'),
  Ollama::Message.new(role: 'assistant', content: 'I am doing well, thank you!')
]
chat = ollama.chat(model: 'llama3.1', stream: true, messages:)
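For illustration, a minimal sketch of the JSON body such a command serializes for the chat endpoint. The hash here is built by hand with stdlib JSON to show the wire format; it stands in for what the DTO produces and is not the gem's own serialization code:

```ruby
require 'json'

# Hand-built request body mirroring the Chat command's fields (illustrative only).
body = {
  model:    'llama3.1',
  messages: [
    { role: 'user',      content: 'Hello, how are you?' },
    { role: 'assistant', content: 'I am doing well, thank you!' }
  ],
  stream: true
}

payload = JSON.generate(body)
puts payload
```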

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Methods included from DTO

#==, #as_array, #as_array_of_hashes, #as_hash, #as_json, #empty?, #to_json

Constructor Details

#initialize(model:, messages:, tools: nil, format: nil, options: nil, stream: nil, keep_alive: nil, think: nil) ⇒ Chat

The initialize method sets up a new command instance for a chat request.

This method is responsible for initializing a new object instance and configuring it with parameters required for chat interactions. It sets up the model, conversation messages, tools, format, options, streaming behavior, keep-alive duration, and thinking mode.

Parameters:

  • model (String)

    the name of the model to use for chat responses

  • messages (Array<Ollama::Message>, Hash, nil)

    conversation history with roles and content

  • tools (Array<Ollama::Tool>, Hash, nil) (defaults to: nil)

    tools available for function calling

  • format (String, nil) (defaults to: nil)

    response format (e.g., ‘json’)

  • options (Ollama::Options, nil) (defaults to: nil)

    configuration parameters for the model

  • stream (TrueClass, FalseClass, nil) (defaults to: nil)

    whether to enable streaming for the operation

  • keep_alive (String, nil) (defaults to: nil)

    duration to keep the model loaded in memory

  • think (Boolean, String, nil) (defaults to: nil)

whether to enable thinking mode for generation; may also be one of the strings “high”, “medium”, or “low” instead of true to set the reasoning effort



# File 'lib/ollama/commands/chat.rb', line 44

def initialize(model:, messages:, tools: nil, format: nil, options: nil, stream: nil, keep_alive: nil, think: nil)
  @model, @messages, @tools, @format, @options, @stream, @keep_alive, @think =
    model, as_array_of_hashes(messages), as_array_of_hashes(tools),
    format, options, stream, keep_alive, think
end

Instance Attribute Details

#client=(value) ⇒ Object (writeonly)

The client attribute writer allows setting the client instance associated with the object.

This method assigns the client that will be used to perform requests and handle responses for this command. It is typically called internally when a command is executed through a client instance.



# File 'lib/ollama/commands/chat.rb', line 100

def client=(value)
  @client = value
end

#format ⇒ String? (readonly)

The format attribute reader returns the response format associated with the object.

Returns:

  • (String, nil)

    response format (e.g., ‘json’)



# File 'lib/ollama/commands/chat.rb', line 68

def format
  @format
end

#keep_alive ⇒ String? (readonly)

The keep_alive attribute reader returns the keep-alive duration associated with the object.

Returns:

  • (String, nil)

    duration to keep the model loaded in memory



# File 'lib/ollama/commands/chat.rb', line 85

def keep_alive
  @keep_alive
end

#messages ⇒ Array<Ollama::Message>? (readonly)

The messages attribute reader returns the conversation history associated with the object.

Returns:

  • (Array<Ollama::Message>, nil)

    conversation history with roles and content



# File 'lib/ollama/commands/chat.rb', line 58

def messages
  @messages
end

#model ⇒ String (readonly)

The model attribute reader returns the model name associated with the object.

Returns:

  • (String)

    the name of the model to use for chat responses



# File 'lib/ollama/commands/chat.rb', line 53

def model
  @model
end

#options ⇒ Ollama::Options? (readonly)

The options attribute reader returns the model configuration parameters associated with the object.

Returns:

  • (Ollama::Options, nil)

    configuration parameters for the model

# File 'lib/ollama/commands/chat.rb', line 73

def options
  @options
end

#stream ⇒ TrueClass, FalseClass, nil (readonly)

The stream attribute reader returns the streaming behavior setting associated with the object.

Returns:

  • (TrueClass, FalseClass, nil)

    the streaming behavior flag, indicating whether streaming is enabled for the command execution (nil by default)



# File 'lib/ollama/commands/chat.rb', line 80

def stream
  @stream
end

#think ⇒ Boolean, String, nil (readonly)

The think attribute reader returns the thinking mode setting associated with the object.

Returns:

  • (Boolean, String, nil)

    whether thinking mode is enabled for reasoning



# File 'lib/ollama/commands/chat.rb', line 90

def think
  @think
end

#tools ⇒ Array<Ollama::Tool>? (readonly)

The tools attribute reader returns the available tools associated with the object.

Returns:

  • (Array<Ollama::Tool>, nil)

    tools available for function calling



# File 'lib/ollama/commands/chat.rb', line 63

def tools
  @tools
end
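As an illustration of what the tools array carries, here is a hand-written function-calling tool definition in the JSON shape the Ollama API accepts for function tools. The hash is built by hand for the sketch; in practice such definitions would be constructed via Ollama::Tool, and the specific tool name and parameters below are made up:

```ruby
require 'json'

# Hand-built function-calling tool definition (illustrative only).
weather_tool = {
  type: 'function',
  function: {
    name: 'get_current_weather',
    description: 'Get the current weather for a location',
    parameters: {
      type: 'object',
      properties: {
        location: { type: 'string', description: 'City name, e.g. Berlin' }
      },
      required: %w[location]
    }
  }
}

puts JSON.pretty_generate(weather_tool)
```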

Class Method Details

.path ⇒ String

The path method returns the API endpoint path for chat requests.

This class method provides the specific URL path used to interact with the Ollama API’s chat endpoint. It is utilized internally by the command structure to determine the correct API route for conversational interactions.

Returns:

  • (String)

    the API endpoint path ‘/api/chat’ for chat requests



# File 'lib/ollama/commands/chat.rb', line 24

def self.path
  '/api/chat'
end
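Combined with a client's base URL, the path resolves to the full endpoint. A quick sketch with stdlib URI, assuming the common default Ollama server address of http://localhost:11434:

```ruby
require 'uri'

base      = URI('http://localhost:11434')  # assumed default Ollama server address
chat_path = '/api/chat'                    # the value returned by Ollama::Commands::Chat.path
endpoint  = URI.join(base, chat_path)
puts endpoint  # full endpoint URL for chat requests
```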

Instance Method Details

#perform(handler) ⇒ self

The perform method executes a command request using the specified handler.

This method initiates a POST request to the Ollama API’s chat endpoint, utilizing the client instance to send the request and process responses through the provided handler. It handles both streaming and non-streaming scenarios based on the command’s configuration.

Parameters:

  • handler (Ollama::Handler)

    the handler object responsible for processing API responses

Returns:

  • (self)

    returns the current instance after initiating the request



# File 'lib/ollama/commands/chat.rb', line 113

def perform(handler)
  @client.request(method: :post, path: self.class.path, body: to_json, stream:, handler:)
end
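When streaming is enabled, the chat endpoint sends newline-delimited JSON chunks. As a rough sketch of what a handler's job looks like (not one of the gem's handler classes), here is a callable that parses such chunks and accumulates the assistant's message content; the sample chunk format mirrors the chat endpoint's streamed responses:

```ruby
require 'json'

# Illustrative collector: parses newline-delimited JSON chunks as the chat
# endpoint streams them and joins the assistant's message fragments.
collect = lambda do |buffer|
  buffer.each_line
        .map { |line| JSON.parse(line) }
        .map { |chunk| chunk.dig('message', 'content').to_s }
        .join
end

# Two sample streamed chunks, as newline-delimited JSON.
stream = <<~CHUNKS
  {"message":{"role":"assistant","content":"Hel"},"done":false}
  {"message":{"role":"assistant","content":"lo!"},"done":true}
CHUNKS

puts collect.call(stream)  # => Hello!
```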