Module: OllamaChat::ModelHandling

Included in:
Chat
Defined in:
lib/ollama_chat/model_handling.rb

Overview

A module that provides functionality for managing Ollama models, including checking model availability, pulling models from remote servers, and handling model presence verification.

This module encapsulates the logic for interacting with Ollama models, ensuring that required models are available locally before attempting to use them in chat sessions. It handles both local model verification and remote model retrieval when necessary.

Examples:

Checking if a model is present

chat.model_present?('llama3.1')

Pulling a model from a remote server

chat.pull_model_from_remote('mistral')

Ensuring a model is available locally

chat.pull_model_unless_present('phi3')

Defined Under Namespace

Classes: ModelMetadata

Instance Method Summary

Instance Method Details

#choose_model(cli_model, current_model) ⇒ Object (private)

The choose_model method selects a model from the available list based on CLI input or user interaction. It processes the provided CLI model parameter to determine if a regex selector is used, filters the models accordingly, and prompts the user to choose from the filtered list if needed. The method ensures that a model is selected and displays a connection message with the chosen model and base URL.



# File 'lib/ollama_chat/model_handling.rb', line 144

def choose_model(cli_model, current_model)
  selector = if cli_model =~ /\A\?+(.*)\z/
               cli_model = ''
               Regexp.new($1)
             end
  models = ollama.tags.models.sort_by(&:name).map { |m| model_with_size(m) }
  selector and models = models.select { _1.value =~ selector }
  model =
    if models.size == 1
      models.first
    elsif cli_model == ''
      OllamaChat::Utils::Chooser.choose(models)&.value || current_model
    else
      cli_model || current_model
    end
ensure
  connect_message(model, ollama.base_url)
end
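For illustration, the `?`-prefix selector logic above can be isolated into a small, self-contained sketch (the helper name `parse_selector` is hypothetical, not part of the module):

```ruby
# Hypothetical helper isolating choose_model's selector parsing: a CLI
# argument starting with one or more '?' characters becomes a regex
# filter, and the remaining model argument is treated as empty.
def parse_selector(cli_model)
  if cli_model =~ /\A\?+(.*)\z/
    [ '', Regexp.new($1) ]
  else
    [ cli_model, nil ]
  end
end

parse_selector('?llama')   # => ["", /llama/]
parse_selector('llama3.1') # => ["llama3.1", nil]
```

With a selector present, `choose_model` narrows the model list to matching names before offering the chooser.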

#model_present?(model) ⇒ ModelMetadata, FalseClass (private)

The model_present? method checks if the specified Ollama model is available.

Parameters:

  • model (String)

    the name of the Ollama model

Returns:

  • (ModelMetadata, FalseClass)

    if the model is present, false otherwise



# File 'lib/ollama_chat/model_handling.rb', line 44

def model_present?(model)
  ollama.show(model:) do |md|
    return ModelMetadata.new(
      model,
      md.system,
      md.capabilities,
    )
  end
rescue Ollama::Errors::NotFoundError
  false
end

#model_with_size(model) ⇒ Object (private)

The model_with_size method formats a model’s size for display by creating a formatted string that includes the model name and its size in a human-readable format with appropriate units.

Parameters:

  • model (Object)

    the model object that has name and size attributes

Returns:

  • (Object)

    a result object with an overridden to_s method that combines the model name and formatted size



# File 'lib/ollama_chat/model_handling.rb', line 96

private def model_with_size(model)
  formatted_size = Term::ANSIColor.bold {
    Tins::Unit.format(model.size, unit: ?B, prefix: 1024, format: '%.1f %U')
  }
  SearchUI::Wrapper.new(model.name, display: "#{model.name} #{formatted_size}")
end
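The `Tins::Unit.format` call scales the byte count by binary prefixes; a rough plain-Ruby equivalent (an approximation for illustration, not the gem's implementation) looks like:

```ruby
# Approximate stand-in for Tins::Unit.format(size, unit: ?B,
# prefix: 1024, format: '%.1f %U'): divide the byte count by powers
# of 1024 and attach the matching unit suffix.
def format_size(bytes)
  units = %w[B KB MB GB TB PB]
  value = bytes.to_f
  index = 0
  while value >= 1024 && index < units.size - 1
    value /= 1024
    index += 1
  end
  '%.1f %s' % [ value, units[index] ]
end

format_size(4_700_000_000) # => "4.4 GB"
```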

#pull_model_from_remote(model) ⇒ Object (private)

The pull_model_from_remote method attempts to retrieve a model from the remote server if it is not found locally.

Parameters:

  • model (String)

    the name of the model to be pulled



# File 'lib/ollama_chat/model_handling.rb', line 60

def pull_model_from_remote(model)
  STDOUT.puts "Model #{bold{model}} not found locally, attempting to pull it from remote now…"
  ollama.pull(model:)
end

#pull_model_unless_present(model) ⇒ ModelMetadata (private)

The pull_model_unless_present method ensures that a specified model is available on the Ollama server. It first checks if the model metadata exists locally; if not, it pulls the model from a remote source and verifies its presence again. If the model still cannot be found, it raises an UnknownModelError indicating the missing model name.

Parameters:

  • model (String)

    the name of the model to ensure is present

Returns:

  • (ModelMetadata)

    the metadata of the model once it is present

Raises:

  • (OllamaChat::UnknownModelError)

    if the model is not found locally and cannot be pulled from the remote



# File 'lib/ollama_chat/model_handling.rb', line 76

def pull_model_unless_present(model)
  if model_metadata = model_present?(model)
    return model_metadata
  else
    pull_model_from_remote(model)
    if model_metadata = model_present?(model)
      return model_metadata
    end
    raise OllamaChat::UnknownModelError, "unknown model named #{model.inspect}"
  end
end
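The check → pull → re-check flow can be sketched against a stubbed registry (`FakeRegistry` and `ensure_model` are illustrative names standing in for the Ollama client's show/pull calls, not part of the gem):

```ruby
# Illustrative stub of the check -> pull -> re-check pattern used by
# pull_model_unless_present. Only 'mistral' is "available remotely".
class FakeRegistry
  def initialize
    @local = { 'llama3.1' => { name: 'llama3.1' } }
  end

  # Returns the metadata hash if the model is known locally, else nil.
  def present?(name) = @local[name]

  # Simulates pulling from remote.
  def pull(name)
    @local[name] = { name: name } if name == 'mistral'
  end
end

def ensure_model(registry, name)
  if metadata = registry.present?(name)
    return metadata
  end
  registry.pull(name)
  registry.present?(name) or raise "unknown model named #{name.inspect}"
end
```

A locally known model returns its metadata immediately; an unknown one is pulled first, and only a model missing from both places raises.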

#use_model(model = nil) ⇒ ModelMetadata (private)

The use_model method selects and sets the model to be used for the chat session.

It allows specifying a particular model or defaults to the current model. After selection, it ensures the model is present locally (pulling it if necessary) and retrieves its metadata. If think? is true and the chosen model does not support thinking, the think mode selector is set to ‘disabled’. If tools_support.on? is true and the chosen model does not support tools, tool support is disabled. Returns the metadata for the selected model.

Parameters:

  • model (String, nil) (defaults to: nil)

    the model name to use; if omitted, the current model is retained

Returns:



# File 'lib/ollama_chat/model_handling.rb', line 117

def use_model(model = nil)
  if model.nil?
    @model = choose_model('', @model)
  else
    @model = choose_model(model, config.model.name)
  end

  @model_metadata = pull_model_unless_present(@model)

  if think? && !@model_metadata.can?('thinking')
    think_mode.selected = 'disabled'
  end

  if tools_support.on? && !@model_metadata.can?('tools')
    tools_support.set false
  end

  @model_metadata
end