Module: LLM

Defined in:
lib/llm.rb,
lib/llm/bot.rb,
lib/llm/error.rb,
lib/llm/buffer.rb,
lib/llm/client.rb,
lib/llm/message.rb,
lib/llm/version.rb,
lib/llm/response.rb,
lib/llm/eventhandler.rb,
lib/llm/providers/xai.rb,
lib/llm/providers/zai.rb,
lib/llm/providers/gemini.rb,
lib/llm/providers/ollama.rb,
lib/llm/providers/openai.rb,
lib/llm/providers/deepseek.rb,
lib/llm/providers/llamacpp.rb,
lib/llm/providers/anthropic.rb

Defined Under Namespace

Modules: Client

Classes: Anthropic, Bot, Buffer, DeepSeek, Error, File, Function, Gemini, LlamaCpp, Message, Object, Ollama, OpenAI, Provider, Response, ResponseError, Schema, ServerTool, Tool, XAI, ZAI

Constant Summary

UnauthorizedError =
  # HTTPUnauthorized (HTTP 401)
  Class.new(ResponseError)

RateLimitError =
  # HTTPTooManyRequests (HTTP 429)
  Class.new(ResponseError)

ServerError =
  # HTTPServerError (HTTP 5xx)
  Class.new(ResponseError)

NoImageError =
  # When no images are found in a response
  Class.new(ResponseError)

FormatError =
  # When given an input object that is not understood
  Class.new(Error)

PromptError =
  # When given a prompt object that is not understood
  Class.new(FormatError)

VERSION =
  "1.0.0"

Class Method Summary

Class Method Details

.File(obj) ⇒ LLM::File

Parameters:

  • obj (String, File, LLM::Response)

    The path to the file, or an existing file reference

Returns:

  • (LLM::File)

# File 'lib/llm/file.rb', line 82

def LLM.File(obj)
  case obj
  when File
    obj.close unless obj.closed?
    LLM.File(obj.path)
  when LLM::File, LLM::Response then obj
  when String then LLM::File.new(obj)
  else raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
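
Example (a sketch; the path is a placeholder):

file = LLM.File("/path/to/image.png") # hypothetical path
file.class # => LLM::File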

.gemini ⇒ Gemini

Returns a new instance of Gemini.

Parameters:

  • key (String, nil)

    The secret key for authentication

  • host (String)

    The host address of the LLM provider

  • port (Integer)

    The port number

  • timeout (Integer)

    The number of seconds to wait for a response

  • ssl (Boolean)

    Whether to use SSL for the connection

  • persistent (Boolean)

    Whether to use a persistent connection. Requires the net-http-persistent gem.

Returns:

  • (Gemini)

    a new instance of Gemini



# File 'lib/llm.rb', line 41

def gemini(**)
  lock(:require) { require_relative "llm/providers/gemini" unless defined?(LLM::Gemini) }
  LLM::Gemini.new(**)
end
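
Example (a sketch; the environment variable name is an assumption):

llm = LLM.gemini(key: ENV["GEMINI_API_KEY"]) # key source is illustrative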

.ollama(key: nil) ⇒ Ollama

Returns a new instance of Ollama.

Parameters:

  • key (String, nil) (defaults to: nil)

    The secret key for authentication

Returns:

  • (Ollama)

    a new instance of Ollama



# File 'lib/llm.rb', line 49

def ollama(key: nil, **)
  lock(:require) { require_relative "llm/providers/ollama" unless defined?(LLM::Ollama) }
  LLM::Ollama.new(key:, **)
end
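
Example (a sketch for a local Ollama server, which does not require a key by default):

llm = LLM.ollama(key: nil) # key defaults to nil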

.llamacpp(key: nil) ⇒ LLM::LlamaCpp

Parameters:

  • key (String, nil) (defaults to: nil)

    The secret key for authentication

Returns:

  • (LLM::LlamaCpp)

# File 'lib/llm.rb', line 57

def llamacpp(key: nil, **)
  lock(:require) { require_relative "llm/providers/llamacpp" unless defined?(LLM::LlamaCpp) }
  LLM::LlamaCpp.new(key:, **)
end
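
Example (a sketch for a local llama.cpp server; passing a nil key mirrors the default):

llm = LLM.llamacpp(key: nil)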

.deepseek ⇒ LLM::DeepSeek

Parameters:

  • key (String, nil)

    The secret key for authentication

Returns:

  • (LLM::DeepSeek)

# File 'lib/llm.rb', line 65

def deepseek(**)
  lock(:require) { require_relative "llm/providers/deepseek" unless defined?(LLM::DeepSeek) }
  LLM::DeepSeek.new(**)
end
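
Example (a sketch; where the key comes from is up to the caller):

llm = LLM.deepseek(key: ENV["DEEPSEEK_API_KEY"]) # env var name is illustrative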

.openai ⇒ OpenAI

Returns a new instance of OpenAI.

Parameters:

  • key (String, nil)

    The secret key for authentication

Returns:

  • (OpenAI)

    a new instance of OpenAI



# File 'lib/llm.rb', line 73

def openai(**)
  lock(:require) { require_relative "llm/providers/openai" unless defined?(LLM::OpenAI) }
  LLM::OpenAI.new(**)
end
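
Example (a sketch; the environment variable name is illustrative):

llm = LLM.openai(key: ENV["OPENAI_API_KEY"]) # key source is an assumption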

.xai ⇒ XAI

Returns a new instance of XAI.

Parameters:

  • key (String, nil)

    The secret key for authentication

  • host (String)

    A regional host or the default (“api.x.ai”)

Returns:

  • (XAI)

    a new instance of XAI



# File 'lib/llm.rb', line 82

def xai(**)
  lock(:require) { require_relative "llm/providers/xai" unless defined?(LLM::XAI) }
  LLM::XAI.new(**)
end
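
Example (a sketch that passes the default host documented above; the key source is illustrative):

llm = LLM.xai(key: ENV["XAI_API_KEY"], host: "api.x.ai") # env var name is an assumption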

.zai ⇒ ZAI

Returns a new instance of ZAI.

Parameters:

  • key (String, nil)

    The secret key for authentication

  • host (String)

    A regional host or the default (“api.z.ai”)

Returns:

  • (ZAI)

    a new instance of ZAI



# File 'lib/llm.rb', line 91

def zai(**)
  lock(:require) { require_relative "llm/providers/zai" unless defined?(LLM::ZAI) }
  LLM::ZAI.new(**)
end
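
Example (a sketch; the key source shown is illustrative):

llm = LLM.zai(key: ENV["ZAI_API_KEY"]) # env var name is an assumption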

.function(key, &b) ⇒ LLM::Function

Define a function

Examples:

LLM.function(:system) do |fn|
  fn.description "Run system command"
  fn.params do |schema|
    schema.object(command: schema.string.required)
  end
  fn.define do |command:|
    system(command)
  end
end

Parameters:

  • key (Symbol)

    The function name / key

  • b (Proc)

    The block to define the function

Returns:

  • (LLM::Function)

# File 'lib/llm.rb', line 111

def function(key, &b)
  LLM::Function.new(key, &b)
end

.lock(name) ⇒ void

This method returns an undefined value.

Provides a thread-safe lock

Parameters:

  • name (Symbol)

    The name of the lock

  • & (Proc)

    The block to execute within the lock



# File 'lib/llm.rb', line 120

def lock(name, &) = @monitors[name].synchronize(&)
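
Example (how the provider constructors above use this lock to serialize their require calls):

lock(:require) { require_relative "llm/providers/openai" unless defined?(LLM::OpenAI) }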

.anthropic ⇒ Anthropic

Returns a new instance of Anthropic.

Parameters:

  • key (String, nil)

    The secret key for authentication

  • host (String)

    The host address of the LLM provider

  • port (Integer)

    The port number

  • timeout (Integer)

    The number of seconds to wait for a response

  • ssl (Boolean)

    Whether to use SSL for the connection

  • persistent (Boolean)

    Whether to use a persistent connection. Requires the net-http-persistent gem.

Returns:

  • (Anthropic)

    a new instance of Anthropic



# File 'lib/llm.rb', line 33

def anthropic(**)
  lock(:require) { require_relative "llm/providers/anthropic" unless defined?(LLM::Anthropic) }
  LLM::Anthropic.new(**)
end
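
Example (a sketch; the environment variable name is an assumption):

llm = LLM.anthropic(key: ENV["ANTHROPIC_API_KEY"]) # key source is illustrative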