Class: LLM::Provider Abstract

Inherits:
Object
Includes:
HTTPClient
Defined in:
lib/llm/provider.rb

Overview

This class is abstract.
Note:

This class is not meant to be instantiated directly. Instead, use one of the subclasses that implement the methods defined here.

The Provider class is an abstract base class for LLM (Large Language Model) providers.

See Also:

  • OpenAI
  • Anthropic
  • Gemini
  • Ollama

Direct Known Subclasses

Anthropic, Gemini, Ollama, OpenAI, VoyageAI

Instance Method Summary

Methods included from HTTPClient

#request

Constructor Details

#initialize(secret, host:, port: 443, timeout: 60, ssl: true) ⇒ Provider

Returns a new instance of Provider.

Parameters:

  • secret (String)

    The secret key for authentication

  • host (String)

    The host address of the LLM provider

  • port (Integer) (defaults to: 443)

    The port number

  • timeout (Integer) (defaults to: 60)

    The number of seconds to wait for a response

  • ssl (Boolean) (defaults to: true)

    Whether to use SSL for the connection



# File 'lib/llm/provider.rb', line 30

def initialize(secret, host:, port: 443, timeout: 60, ssl: true)
  @secret = secret
  @http = Net::HTTP.new(host, port).tap do |http|
    http.use_ssl = ssl
    http.read_timeout = timeout
  end
end
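The constructor's HTTP client setup uses only Ruby's standard library and can be reproduced in isolation. A minimal sketch (the host name below is just a placeholder; no connection is opened until a request is made):

```ruby
require "net/http"

# Mirror the constructor's Net::HTTP setup: enable SSL and set the
# read timeout before any request is issued.
http = Net::HTTP.new("api.example.com", 443).tap do |h|
  h.use_ssl = true
  h.read_timeout = 60
end

puts http.use_ssl?      # => true
puts http.read_timeout  # => 60
```

Because `Net::HTTP.new` is lazy, the object can be configured freely; the TCP connection happens only when a request method is called.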

Instance Method Details

#models ⇒ Hash<String, LLM::Model>

Returns a hash of available models

Returns:

  • (Hash<String, LLM::Model>)

    Returns a hash of available models

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 116

def models
  raise NotImplementedError
end
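The NotImplementedError pattern used throughout this class can be sketched in isolation (the AbstractProvider and ToyProvider classes below are hypothetical stand-ins, not the gem's actual code):

```ruby
# Abstract base: declares the interface and raises until overridden,
# as LLM::Provider#models does.
class AbstractProvider
  def models
    raise NotImplementedError
  end
end

# Concrete subclass: supplies a real implementation.
class ToyProvider < AbstractProvider
  def models
    {"toy-model" => :stub}
  end
end

begin
  AbstractProvider.new.models
rescue NotImplementedError
  puts "abstract: not implemented"
end
puts ToyProvider.new.models.keys.first # => "toy-model"
```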

#inspect ⇒ String

Note:

The secret key is redacted in inspect for security reasons

Returns an inspection of the provider object

Returns:

  • (String)


# File 'lib/llm/provider.rb', line 42

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} @secret=[REDACTED] @http=#{@http.inspect}>"
end
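The redaction behaviour can be sketched with a minimal stand-in class (the Client class below is hypothetical, not the gem's code): overriding #inspect keeps the secret out of logs, error messages, and REPL output.

```ruby
class Client
  def initialize(secret)
    @secret = secret
  end

  # Redact the secret instead of using Ruby's default #inspect,
  # which would print all instance variables verbatim.
  def inspect
    "#<#{self.class.name}:0x#{object_id.to_s(16)} @secret=[REDACTED]>"
  end
end

c = Client.new("s3cr3t")
puts c.inspect.include?("s3cr3t")     # => false
puts c.inspect.include?("[REDACTED]") # => true
```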

#embed(input, **params) ⇒ LLM::Response::Embedding

Parameters:

  • input (String, Array<String>)

    The input to embed

Returns:

  • (LLM::Response::Embedding)
Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 52

def embed(input, **params)
  raise NotImplementedError
end

#complete(prompt, role = :user, **params) ⇒ LLM::Response::Completion

Completes a given prompt using the LLM

Examples:

llm = LLM.openai(ENV["KEY"])
context = [
  {role: "system", content: "Answer all of my questions"},
  {role: "system", content: "Your name is Pablo, you are 25 years old and you are my amigo"},
]
res = llm.complete "What is your name and what age are you?", :user, messages: context
print "[#{res.choices[0].role}]", res.choices[0].content, "\n"

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

  • messages (Array<Hash, LLM::Message>)

    The messages to include in the completion

Returns:

  • (LLM::Response::Completion)
Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 75

def complete(prompt, role = :user, **params)
  raise NotImplementedError
end

#chat(prompt, role = :user, **params) ⇒ LLM::LazyConversation

Note:

This method creates a lazy variant of an LLM::Conversation object.

Starts a new lazy conversation

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

Returns:

  • (LLM::LazyConversation)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 88

def chat(prompt, role = :user, **params)
  LLM::Conversation.new(self, params).lazy.chat(prompt, role)
end

#chat!(prompt, role = :user, **params) ⇒ LLM::Conversation

Note:

This method creates a non-lazy variant of an LLM::Conversation object.

Starts a new conversation

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

Returns:

  • (LLM::Conversation)
Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 101

def chat!(prompt, role = :user, **params)
  LLM::Conversation.new(self, params).chat(prompt, role)
end
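The lazy/eager distinction between #chat and #chat! can be illustrated with a toy sketch (the ToyConversation class below is hypothetical; the real LLM::Conversation issues API requests instead of appending to arrays):

```ruby
# Toy model of a conversation: the lazy variant buffers messages and
# only "sends" them when a reply is read; the eager variant sends
# on every #chat call.
class ToyConversation
  attr_reader :sent

  def initialize(lazy: false)
    @lazy = lazy
    @queue = []
    @sent = []
  end

  def chat(message)
    if @lazy
      @queue << message # buffered; no request yet
    else
      @sent << message  # "request" happens immediately
    end
    self
  end

  def last_message
    @sent.concat(@queue) # flush the buffer on read
    @queue.clear
    @sent.last
  end
end

eager = ToyConversation.new.chat("hi")
lazy  = ToyConversation.new(lazy: true).chat("hi")
puts eager.sent.size   # => 1
puts lazy.sent.size    # => 0 (nothing sent until read)
puts lazy.last_message # => "hi"
```

Laziness lets several #chat calls be batched into a single request, at the cost of deferring errors until the conversation is actually read.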

#assistant_role ⇒ String

Returns the role of the assistant in the conversation. Usually “assistant” or “model”

Returns:

  • (String)

    Returns the role of the assistant in the conversation. Usually “assistant” or “model”

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 109

def assistant_role
  raise NotImplementedError
end