Class: LLM::Provider (Abstract)

Inherits:
Object
Includes:
Transport::HTTP::Execution
Defined in:
lib/llm/provider.rb,
lib/llm/provider/transport/http.rb,
lib/llm/provider/transport/http/interruptible.rb

Overview

This class is abstract.

The Provider class is the abstract base class for LLM (Large Language Model) providers.

Direct Known Subclasses

Anthropic, Google, Ollama, OpenAI

Defined Under Namespace

Modules: Transport

Instance Method Summary

Constructor Details

#initialize(key:, host:, port: 443, timeout: 60, ssl: true, base_path: "", persistent: false) ⇒ Provider

Returns a new instance of Provider.

Parameters:

  • key (String, nil)

    The secret key for authentication

  • host (String)

    The host address of the LLM provider

  • port (Integer) (defaults to: 443)

    The port number

  • timeout (Integer) (defaults to: 60)

    The number of seconds to wait for a response

  • ssl (Boolean) (defaults to: true)

    Whether to use SSL for the connection

  • base_path (String) (defaults to: "")

    Optional base path prefix for HTTP API routes.

  • persistent (Boolean) (defaults to: false)

    Whether to use a persistent connection. Requires the net-http-persistent gem.



# File 'lib/llm/provider.rb', line 30

def initialize(key:, host:, port: 443, timeout: 60, ssl: true, base_path: "", persistent: false)
  @key = key
  @host = host
  @port = port
  @timeout = timeout
  @ssl = ssl
  @base_path = normalize_base_path(base_path)
  @base_uri = URI("#{ssl ? "https" : "http"}://#{host}:#{port}/")
  @headers = {"User-Agent" => "llm.rb v#{LLM::VERSION}"}
  @transport = Transport::HTTP.new(host:, port:, timeout:, ssl:, persistent:)
  @monitor = Monitor.new
end
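The constructor derives the base URI from the `ssl`, `host`, and `port` arguments using the interpolation shown above. A minimal standalone sketch, with hypothetical host values (note that `URI#to_s` omits the port when it matches the scheme's default):

```ruby
require "uri"

# Hypothetical argument values; the real constructor receives these
# as keyword arguments (ssl: true, host:, port: 443).
ssl, host, port = true, "api.openai.com", 443

# Same interpolation the constructor uses to build @base_uri.
base_uri = URI("#{ssl ? "https" : "http"}://#{host}:#{port}/")
puts base_uri # => https://api.openai.com/ (443 is the https default, so it is omitted)
```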

Instance Method Details

#streamable?(stream) ⇒ Boolean

Parameters:

  • stream (Object)

    The object to check for streaming support

Returns:

  • (Boolean)


# File 'lib/llm/provider.rb', line 344

def streamable?(stream)
  LLM::Stream === stream || stream.respond_to?(:<<)
end
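The check above is duck typing: anything that responds to `#<<` (an IO, a mutable String buffer, an Array) qualifies as a stream sink. A plain-Ruby sketch of the same idea, with the gem-specific `LLM::Stream` case omitted:

```ruby
# Plain-Ruby sketch of the duck-typing half of #streamable?.
# The LLM::Stream case-equality check is part of the gem and is left out here.
def streamable?(stream)
  stream.respond_to?(:<<)
end

puts streamable?(+"")     # true: a mutable String accepts <<
puts streamable?($stdout) # true: an IO accepts <<
puts streamable?(nil)     # false: nil has no <<
```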

#inspect ⇒ String

Note:

The secret key is redacted in inspect for security reasons

Returns an inspection of the provider object

Returns:

  • (String)


# File 'lib/llm/provider.rb', line 47

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} @key=[REDACTED] @transport=#{transport.inspect} @tracer=#{tracer.inspect}>"
end
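The redaction shown above is a general pattern: override `#inspect` so the secret never leaks into logs, exception messages, or REPL output. A minimal sketch with a hypothetical stand-in class:

```ruby
# Minimal sketch of the redaction pattern used by LLM::Provider#inspect.
# Client is a hypothetical stand-in, not part of the gem.
class Client
  def initialize(key)
    @key = key
  end

  # Default #inspect would print @key verbatim; this override hides it.
  def inspect
    "#<#{self.class.name}:0x#{object_id.to_s(16)} @key=[REDACTED]>"
  end
end

puts Client.new("sk-secret").inspect # the literal "sk-secret" never appears
```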

#name ⇒ Symbol

Returns the provider's name

Returns:

  • (Symbol)

    Returns the provider's name

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 56

def name
  raise NotImplementedError
end
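This `raise NotImplementedError` body is the abstract-method convention used throughout the class: the base class declares the interface and each subclass (Anthropic, Google, Ollama, OpenAI) supplies the value. A self-contained sketch with hypothetical class names:

```ruby
# Sketch of the abstract-method convention. AbstractProvider and FakeProvider
# are illustrative names, not classes from the gem.
class AbstractProvider
  def name
    raise NotImplementedError, "#{self.class} must implement #name"
  end
end

class FakeProvider < AbstractProvider
  def name = :fake
end

puts FakeProvider.new.name       # :fake
# AbstractProvider.new.name      # would raise NotImplementedError
```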

#embed(input, model: nil, **params) ⇒ LLM::Response

Provides an embedding

Parameters:

  • input (String, Array<String>)

    The input to embed

  • model (String) (defaults to: nil)

    The embedding model to use

  • params (Hash)

    Other embedding parameters

Returns:

  • (LLM::Response)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 71

def embed(input, model: nil, **params)
  raise NotImplementedError
end

#complete(prompt, params = {}) ⇒ LLM::Response

Provides an interface to the chat completions API

Examples:

llm = LLM.openai(key: ENV["KEY"])
messages = [{role: "system", content: "Your task is to answer all of my questions"}]
res = llm.complete("5 + 2 ?", messages:)
print "[#{res.messages[0].role}]", res.messages[0].content, "\n"

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not just those listed here.

Options Hash (params):

  • :role (Symbol)

    Defaults to the provider's default role

  • :model (String)

    Defaults to the provider's default model

  • :schema (#to_json, nil)

    Defaults to nil

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil

Returns:

  • (LLM::Response)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 95

def complete(prompt, params = {})
  raise NotImplementedError
end

#chat(prompt, params = {}) ⇒ LLM::Context

Starts a new chat powered by the chat completions API

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not just those listed here.

Returns:

  • (LLM::Context)



# File 'lib/llm/provider.rb', line 104

def chat(prompt, params = {})
  role = params.delete(:role)
  LLM::Context.new(self, params).talk(prompt, role:)
end

#respond(prompt, params = {}) ⇒ LLM::Context

Starts a new chat powered by the responses API

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not just those listed here.

Returns:

  • (LLM::Context)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 115

def respond(prompt, params = {})
  role = params.delete(:role)
  LLM::Context.new(self, params).respond(prompt, role:)
end

#responses ⇒ LLM::OpenAI::Responses

Note:

Compared to the chat completions API, the responses API can require less bandwidth on each turn, maintain state server-side, and produce faster responses.

Returns:

  • (LLM::OpenAI::Responses)

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 127

def responses
  raise NotImplementedError
end

#images ⇒ LLM::OpenAI::Images, LLM::Google::Images

Returns an interface to the images API

Returns:

  • (LLM::OpenAI::Images, LLM::Google::Images)

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 134

def images
  raise NotImplementedError
end

#audio ⇒ LLM::OpenAI::Audio

Returns an interface to the audio API

Returns:

  • (LLM::OpenAI::Audio)

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 141

def audio
  raise NotImplementedError
end

#files ⇒ LLM::OpenAI::Files

Returns an interface to the files API

Returns:

  • (LLM::OpenAI::Files)

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 148

def files
  raise NotImplementedError
end

#models ⇒ LLM::OpenAI::Models

Returns an interface to the models API

Returns:

  • (LLM::OpenAI::Models)

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 155

def models
  raise NotImplementedError
end

#moderations ⇒ LLM::OpenAI::Moderations

Returns an interface to the moderations API

Returns:

  • (LLM::OpenAI::Moderations)

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 162

def moderations
  raise NotImplementedError
end

#vector_stores ⇒ LLM::OpenAI::VectorStore

Returns an interface to the vector stores API

Returns:

  • (LLM::OpenAI::VectorStore)

    Returns an interface to the vector stores API

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 169

def vector_stores
  raise NotImplementedError
end

#assistant_role ⇒ String

Returns the role of the assistant in the conversation. Usually "assistant" or "model"

Returns:

  • (String)

    Returns the role of the assistant in the conversation. Usually "assistant" or "model"

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 177

def assistant_role
  raise NotImplementedError
end

#default_model ⇒ String

Returns the default model for chat completions

Returns:

  • (String)

    Returns the default model for chat completions

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 184

def default_model
  raise NotImplementedError
end

#schema ⇒ LLM::Schema

Returns an object that can generate a JSON schema

Returns:

  • (LLM::Schema)



# File 'lib/llm/provider.rb', line 191

def schema
  LLM::Schema.new
end

#with(headers:) ⇒ LLM::Provider

Add one or more headers to all requests

Examples:

llm = LLM.openai(key: ENV["KEY"])
llm.with(headers: {"OpenAI-Organization" => ENV["ORG"]})
llm.with(headers: {"OpenAI-Project" => ENV["PROJECT"]})

Parameters:

  • headers (Hash<String,String>)

    One or more headers

Returns:

  • (LLM::Provider)



# File 'lib/llm/provider.rb', line 205

def with(headers:)
  lock do
    tap { @headers.merge!(headers) }
  end
end
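The `lock do ... end` wrapper above suggests the header merge is synchronized (the constructor also creates a `Monitor`). A hedged sketch of that design with Ruby's stdlib `Monitor` and illustrative names, not the gem's internals:

```ruby
require "monitor"

# Hypothetical sketch of synchronized header merging. HeaderStore is an
# illustrative class; the gem's #with uses its own lock helper.
class HeaderStore
  attr_reader :headers

  def initialize
    @headers = {"User-Agent" => "llm.rb"}
    @monitor = Monitor.new
  end

  # Merges under the monitor so concurrent threads cannot interleave writes,
  # and returns self so calls can be chained, mirroring #with.
  def with(headers:)
    @monitor.synchronize { @headers.merge!(headers) }
    self
  end
end

store = HeaderStore.new
store.with(headers: {"OpenAI-Organization" => "org-123"})
     .with(headers: {"OpenAI-Project" => "proj-456"})
puts store.headers.keys.sort
```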

#server_tools ⇒ Hash{String => LLM::ServerTool}

Note:

This list may be incomplete; the LLM::Provider#server_tool method can be used to build a tool that is not listed here.

Returns all known server-side tools offered by the provider.

Returns:

  • (Hash{String => LLM::ServerTool})



# File 'lib/llm/provider.rb', line 217

def server_tools
  {}
end

#server_tool(name, options = {}) ⇒ LLM::ServerTool

Note:

OpenAI, Anthropic, and Gemini provide platform tools for tasks such as web search.

Returns a server-side tool offered by the provider.

Examples:

llm   = LLM.openai(key: ENV["KEY"])
tools = [llm.server_tool(:web_search)]
res   = llm.responses.create("Summarize today's news", tools:)
print res.output_text, "\n"

Parameters:

  • name (String, Symbol)

    The name of the tool

  • options (Hash) (defaults to: {})

    Configuration options for the tool

Returns:

  • (LLM::ServerTool)



# File 'lib/llm/provider.rb', line 234

def server_tool(name, options = {})
  LLM::ServerTool.new(name, options, self)
end

#web_search(query:) ⇒ LLM::Response

Provides a web search capability

Parameters:

  • query (String)

    The search query

Returns:

  • (LLM::Response)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 244

def web_search(query:)
  raise NotImplementedError
end

#user_role ⇒ Symbol

Returns:

  • (Symbol)


# File 'lib/llm/provider.rb', line 250

def user_role
  :user
end

#system_role ⇒ Symbol

Returns:

  • (Symbol)


# File 'lib/llm/provider.rb', line 256

def system_role
  :system
end

#developer_role ⇒ Symbol

Returns:

  • (Symbol)


# File 'lib/llm/provider.rb', line 262

def developer_role
  :developer
end

#tool_role ⇒ Symbol

Returns:

  • (Symbol)


# File 'lib/llm/provider.rb', line 268

def tool_role
  :tool
end

#tracer ⇒ LLM::Tracer

Returns the current scoped tracer override or provider default tracer

Returns:

  • (LLM::Tracer)

    Returns the current scoped tracer override or provider default tracer



# File 'lib/llm/provider.rb', line 275

def tracer
  weakmap[self] || @tracer || LLM::Tracer::Null.new(self)
end

#tracer=(tracer) ⇒ void

This method returns an undefined value.

Sets the provider's default tracer. This tracer is shared by the provider instance and becomes the fallback whenever no scoped override is active.

Examples:

llm = LLM.openai(key: ENV["KEY"])
llm.tracer = LLM::Tracer::Logger.new(llm, path: "/path/to/log.txt")

Parameters:

  • tracer (LLM::Tracer)

    The tracer to set as the provider's default



# File 'lib/llm/provider.rb', line 289

def tracer=(tracer)
  @tracer = tracer
end

#with_tracer(tracer) { ... } ⇒ Object

Override the tracer for the current fiber while the block runs. This is useful when you want per-request or per-turn tracing without replacing the provider's default tracer.

Examples:

llm.with_tracer(LLM::Tracer::Logger.new(llm, io: $stdout)) do
  llm.complete("hello", model: "gpt-5.4-mini")
end

Parameters:

  • tracer (LLM::Tracer)

    The tracer to use while the block runs

Yields:

  The block during which the scoped tracer override is active

Returns:

  • (Object)

    Returns the value of the block
    
# File 'lib/llm/provider.rb', line 304

def with_tracer(tracer)
  had_override = weakmap.key?(self)
  previous = weakmap[self]
  weakmap[self] = tracer
  yield
ensure
  if had_override
    weakmap[self] = previous
  elsif weakmap.respond_to?(:delete)
    weakmap.delete(self)
  else
    weakmap[self] = nil
  end
end
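The save/restore dance in `with_tracer` distinguishes "no override" from "override set to nil", so nested calls restore correctly. The same pattern can be sketched with a plain Hash standing in for the gem's fiber-keyed weak map:

```ruby
# Generic sketch of the scoped-override pattern behind #with_tracer.
# OVERRIDES is a plain Hash here; the gem uses a fiber-keyed weak map.
OVERRIDES = {}

def with_override(key, value)
  had_override = OVERRIDES.key?(key)
  previous = OVERRIDES[key]
  OVERRIDES[key] = value
  yield
ensure
  # Runs even if the block raises: restore the prior value, or remove
  # the key entirely if there was no override before.
  if had_override
    OVERRIDES[key] = previous
  else
    OVERRIDES.delete(key)
  end
end

with_override(:tracer, :scoped) do
  puts OVERRIDES[:tracer] # :scoped while the block runs
end
puts OVERRIDES.key?(:tracer) # false: the override is removed on exit
```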

#persist! ⇒ LLM::Provider Also known as: persistent

This method configures a provider to use a persistent connection pool via the optional dependency Net::HTTP::Persistent

Examples:

llm = LLM.openai(key: ENV["KEY"]).persistent
# do something with 'llm'

Returns:

  • (LLM::Provider)

# File 'lib/llm/provider.rb', line 326

def persist!
  transport.persist!
  self
end

#interrupt!(owner) ⇒ nil Also known as: cancel!

Interrupt the active request, if any.

Parameters:

  • owner (Fiber)

Returns:

  • (nil)
    
# File 'lib/llm/provider.rb', line 336

def interrupt!(owner)
  transport.interrupt!(owner)
end