Class: LLM::Agent

Inherits:
Object
Defined in:
lib/llm/agent.rb

Overview

LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.

Notes:

  • Instructions are injected only on the first request.
  • An agent will automatically execute tool calls (unlike LLM::Session).
  • The idea originally came from RubyLLM and was adapted to llm.rb.

Examples:

class SystemAdmin < LLM::Agent
  model "gpt-4.1-nano"
  instructions "You are a Linux system admin"
  tools Shell
  schema Result
end

llm = LLM.openai(key: ENV["KEY"])
agent = SystemAdmin.new(llm)
agent.talk("Run 'date'")

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Agent

Returns a new instance of Agent.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider's default model

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil

  • :schema (#to_json, nil)

    Defaults to nil



# File 'lib/llm/agent.rb', line 80

def initialize(provider, params = {})
  defaults = {model: self.class.model, tools: self.class.tools, schema: self.class.schema}.compact
  @provider = provider
  @ses = LLM::Session.new(provider, defaults.merge(params))
  @instructions_applied = false
end
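The constructor's precedence rules can be sketched in plain Ruby (no llm.rb required): class-level defaults whose value is nil are dropped by #compact, and per-instance params override whatever defaults remain. The values below are illustrative.

```ruby
# Sketch of the constructor's merge logic. Class-level defaults with nil
# values are dropped by #compact; per-instance params then win on merge.
class_defaults = {model: "gpt-4.1-nano", tools: nil, schema: nil}.compact
instance_params = {model: "gpt-4o-mini", temperature: 0.2}

merged = class_defaults.merge(instance_params)
# => {model: "gpt-4o-mini", temperature: 0.2}
```

Because the merge favors the instance params, a subclass's DSL defaults act as a baseline that each Agent instance can selectively override.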

Class Method Details

.tools(*tools) ⇒ Array<LLM::Function>

Set or get the default tools

Parameters:

  • tools (Array<LLM::Function>)

    Zero or more tools to set as the defaults
Returns:

  • (Array<LLM::Function>)

    Returns the current tools when no argument is provided



# File 'lib/llm/agent.rb', line 43

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end

.schema(schema = nil) ⇒ #to_json?

Set or get the default schema

Parameters:

  • schema (#to_json, nil) (defaults to: nil)

    The schema

Returns:

  • (#to_json, nil)

    Returns the current schema when no argument is provided



# File 'lib/llm/agent.rb', line 54

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end

.instructions(instructions = nil) ⇒ String?

Set or get the default instructions

Parameters:

  • instructions (String, nil) (defaults to: nil)

    The system instructions

Returns:

  • (String, nil)

    Returns the current instructions when no argument is provided



# File 'lib/llm/agent.rb', line 65

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end

.model(model = nil) ⇒ String?

Set or get the default model

Parameters:

  • model (String, nil) (defaults to: nil)

    The model identifier

Returns:

  • (String, nil)

    Returns the current model when no argument is provided



# File 'lib/llm/agent.rb', line 32

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end
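All four class methods share the same set-or-get pattern: called with no argument they read a class-level instance variable, called with an argument they write it. A standalone sketch (the class and names below are illustrative, not llm.rb's own):

```ruby
# Minimal sketch of the set-or-get DSL pattern used by .model, .tools,
# .schema and .instructions. AgentLike and Admin are illustrative names.
class AgentLike
  # Single-value form, as used by .model / .schema / .instructions
  def self.model(model = nil)
    return @model if model.nil?
    @model = model
  end

  # Variadic form, as used by .tools: accepts a list or an array
  def self.tools(*tools)
    return @tools || [] if tools.empty?
    @tools = tools.flatten
  end
end

class Admin < AgentLike
  model "gpt-4.1-nano"
  tools [:shell, :files]   # flattened, so this equals tools(:shell, :files)
end

Admin.model  # => "gpt-4.1-nano"
Admin.tools  # => [:shell, :files]
```

Note that the state lives in class-level instance variables, so each subclass keeps its own defaults rather than sharing them through the parent.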

Instance Method Details

#model ⇒ String

Returns the model an Agent is actively using

Returns:

  • (String)


# File 'lib/llm/agent.rb', line 203

def model
  @ses.model
end

#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat

Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
agent = LLM::Agent.new(llm)
response = agent.talk("Hello, what is your name?")
puts response.choices[0].content

Parameters:

  • params (Hash) (defaults to: {})

    The params passed to the provider, including optional :stream, :tools, :schema etc.

  • prompt (String)

    The input prompt to be completed

Options Hash (params):

  • :max_tool_rounds (Integer)

The maximum number of tool call iterations (default: 10)

Returns:

  • (LLM::Response)


# File 'lib/llm/agent.rb', line 100

def talk(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @ses.talk(apply_instructions(prompt), params)
  until @ses.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @ses.talk @ses.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end
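The bounded tool-call loop above can be sketched with a stub session, with no provider involved. StubSession, ToolLoopError, and run_tool_loop below are illustrative stand-ins that mirror the source's control flow, not llm.rb's actual classes:

```ruby
# Sketch of #talk's bounded tool-call loop using a stub session.
ToolLoopError = Class.new(StandardError)

class StubSession
  def initialize(rounds_of_tool_calls)
    @pending = rounds_of_tool_calls  # responses that still request tools
  end

  def talk(_input, _params = {})
    @pending -= 1 if @pending > 0
    :response
  end

  # Pending function calls from the last response (empty when done)
  def functions
    @pending > 0 ? [proc { :tool_result }] : []
  end
end

def run_tool_loop(ses, max: 10)
  i = 0
  res = ses.talk("prompt")
  until ses.functions.empty?
    raise ToolLoopError, "pending tool calls remain" if i >= max
    res = ses.talk(ses.functions.map(&:call))
    i += 1
  end
  res
end

run_tool_loop(StubSession.new(3))  # completes within the bound
```

The counter guards against a model that keeps requesting tools indefinitely: once `max` rounds have been executed and calls are still pending, the loop raises instead of issuing another request.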

#respond(prompt, params = {}) ⇒ LLM::Response

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
agent = LLM::Agent.new(llm)
res = agent.respond("What is the capital of France?")
puts res.output_text

Parameters:

  • params (Hash) (defaults to: {})

    The params passed to the provider, including optional :stream, :tools, :schema etc.

  • prompt (String)

    The input prompt to be completed

Options Hash (params):

  • :max_tool_rounds (Integer)

The maximum number of tool call iterations (default: 10)

Returns:

  • (LLM::Response)


# File 'lib/llm/agent.rb', line 127

def respond(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @ses.respond(apply_instructions(prompt), params)
  until @ses.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @ses.respond @ses.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end

#messages ⇒ LLM::Buffer<LLM::Message>

Returns the conversation's message history

Returns:

  • (LLM::Buffer<LLM::Message>)



# File 'lib/llm/agent.rb', line 141

def messages
  @ses.messages
end

#functions ⇒ Array<LLM::Function>

Returns pending function (tool) calls from the most recent response

Returns:

  • (Array<LLM::Function>)


# File 'lib/llm/agent.rb', line 147

def functions
  @ses.functions
end

#usage ⇒ LLM::Object

Returns usage information for the conversation

Returns:

  • (LLM::Object)



153
154
155
# File 'lib/llm/agent.rb', line 153

def usage
  @ses.usage
end

#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt

Parameters:

  • b (Proc)

    A block that composes messages. If it takes one argument, it receives the prompt object. Otherwise it runs in prompt context.

Returns:

  • (LLM::Prompt)



# File 'lib/llm/agent.rb', line 161

def prompt(&b)
  @ses.prompt(&b)
end
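The two block styles described above (explicit prompt argument vs. prompt context) are commonly implemented by dispatching on the block's arity. A standalone sketch, where PromptLike is an illustrative class rather than llm.rb's own:

```ruby
# Sketch of arity-based block dispatch for a prompt builder.
class PromptLike
  attr_reader :messages

  def initialize(&b)
    @messages = []
    # A one-argument block receives the prompt object; a zero-argument
    # block is evaluated in the prompt's own context.
    b.arity == 1 ? b.call(self) : instance_eval(&b)
  end

  def user(text)
    @messages << [:user, text]
  end
end

explicit = PromptLike.new { |p| p.user("hello") }
implicit = PromptLike.new { user("hello") }
explicit.messages  # => [[:user, "hello"]]
```

Both forms produce the same messages; the explicit form is useful when the block also needs access to the surrounding scope's `self`.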

#image_url(url) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • url (String)

    The URL

Returns:

  • (LLM::Object)


# File 'lib/llm/agent.rb', line 171

def image_url(url)
  @ses.image_url(url)
end

#local_file(path) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • path (String)

    The path

Returns:

  • (LLM::Object)


# File 'lib/llm/agent.rb', line 180

def local_file(path)
  @ses.local_file(path)
end

#remote_file(res) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • res (LLM::Response)

    A response that references a remote file

Returns:

  • (LLM::Object)



# File 'lib/llm/agent.rb', line 189

def remote_file(res)
  @ses.remote_file(res)
end

#tracer ⇒ LLM::Tracer

Returns an LLM tracer

Returns:

  • (LLM::Tracer)


# File 'lib/llm/agent.rb', line 196

def tracer
  @ses.tracer
end