Class: LLM::Agent

Inherits:
Object
Defined in:
lib/llm/agent.rb

Overview

LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.

It wraps the same stateful runtime surface as LLM::Context: message history, usage, persistence, streaming parameters, and provider-backed requests still flow through an underlying context. The defining behavior of an agent is that it automatically resolves pending tool calls for you during talk and respond, instead of leaving tool loops to the caller.

Notes:

  • Instructions are injected once unless a system message is already present.
  • An agent automatically executes tool loops (unlike LLM::Context).
  • The automatic tool loop enables the wrapped context's guard by default. The built-in LLM::LoopGuard detects repeated tool-call patterns and blocks stuck execution before more tool work is queued.
  • The default tool attempt budget is 25. After that, the agent sends advisory tool errors back through the model and keeps the loop in-band. Set tool_attempts: nil to disable that advisory behavior.
  • Tool loop execution can be configured with concurrency :call, :thread, :task, :fiber, :ractor, or a list of strategies to wait on in order, such as [:thread, :ractor] (see the second example below).

Examples:

class SystemAdmin < LLM::Agent
  model "gpt-4.1-nano"
  instructions "You are a Linux system admin"
  tools Shell
  schema Result
end

llm = LLM.openai(key: ENV["KEY"])
agent = SystemAdmin.new(llm)
agent.talk("Run 'date'")
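
A further sketch, reusing the Shell tool and llm from above, of a subclass that opts its tool loop into thread-based execution. The class name, prompt, and choice of :thread are illustrative:

class ConcurrentAdmin < LLM::Agent
  model "gpt-4.1-nano"
  instructions "You are a Linux system admin"
  tools Shell
  concurrency :thread   # pending tool calls run on threads instead of sequentially
end

agent = ConcurrentAdmin.new(llm)
agent.talk("Run 'uptime' and 'df -h'")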

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(llm, params = {}) ⇒ Agent

Returns a new instance of Agent.

Parameters:

  • llm (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not just those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider's default model

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil

  • :skills (Array<String>, nil)

    Defaults to nil

  • :schema (#to_json, nil)

    Defaults to nil

  • :tracer (LLM::Tracer, Proc, nil)

    Optional tracer override for this agent instance

  • :concurrency (Symbol, Array<Symbol>, nil)

    Defaults to the agent class concurrency



# File 'lib/llm/agent.rb', line 154

def initialize(llm, params = {})
  defaults = {model: self.class.model, tools: self.class.tools, skills: self.class.skills, schema: self.class.schema}.compact
  @concurrency = params.delete(:concurrency) || self.class.concurrency
  @llm = llm
  tracer = params.key?(:tracer) ? params.delete(:tracer) : self.class.tracer
  @tracer = resolve_option(tracer) unless tracer.nil?
  @ctx = LLM::Context.new(llm, defaults.merge({guard: true}).merge(params))
end
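
For illustration, params passed to the constructor override the class-level defaults; the model identifier below is a placeholder:

llm = LLM.openai(key: ENV["KEY"])
agent = SystemAdmin.new(llm, model: "gpt-4.1-mini")  # overrides the class default of "gpt-4.1-nano"
agent.model # => "gpt-4.1-mini"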

Instance Attribute Details

#llm ⇒ LLM::Provider (readonly)

Returns a provider

Returns:

  • (LLM::Provider)


# File 'lib/llm/agent.rb', line 44

def llm
  @llm
end

Class Method Details

.tools(*tools) ⇒ Array<LLM::Function>

Set or get the default tools

Parameters:

  • tools (Array<LLM::Function>)

    One or more tools

Returns:

  • (Array<LLM::Function>)

    Returns the current tools when no argument is provided



# File 'lib/llm/agent.rb', line 63

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end
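
A small sketch of the getter/setter behavior; Admin is a throwaway class and Shell is the tool from the overview example:

class Admin < LLM::Agent
  tools Shell
end

Admin.tools      # => [Shell]
LLM::Agent.tools # => [] (no tools configured on the base class)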

.skills(*skills) ⇒ Array<String>?

Set or get the default skills

Parameters:

  • skills (Array<String>, nil)

    One or more skill directories

Returns:

  • (Array<String>, nil)

    Returns the current skills when no argument is provided



# File 'lib/llm/agent.rb', line 74

def self.skills(*skills)
  return @skills if skills.empty?
  @skills = skills.flatten
end
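
A sketch assuming skill directories exist on disk; the paths are illustrative:

class Support < LLM::Agent
  skills "./skills/triage", "./skills/billing"
end

Support.skills # => ["./skills/triage", "./skills/billing"]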

.schema(schema = nil) ⇒ #to_json?

Set or get the default schema

Parameters:

  • schema (#to_json, nil) (defaults to: nil)

    The schema

Returns:

  • (#to_json, nil)

    Returns the current schema when no argument is provided



# File 'lib/llm/agent.rb', line 85

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end
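
Since any object that responds to #to_json is accepted, a plain Hash can serve as a minimal schema. The JSON Schema below is illustrative; a provider-supplied schema builder could be used instead where available:

require "json" # gives Hash#to_json

class Reporter < LLM::Agent
  schema({
    "type" => "object",
    "properties" => {"summary" => {"type" => "string"}},
    "required" => ["summary"]
  })
end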

.instructions(instructions = nil) ⇒ String?

Set or get the default instructions

Parameters:

  • instructions (String, nil) (defaults to: nil)

    The system instructions

Returns:

  • (String, nil)

    Returns the current instructions when no argument is provided



# File 'lib/llm/agent.rb', line 96

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end

.concurrency(concurrency = nil) ⇒ Symbol, ...

Set or get the tool execution concurrency.

Parameters:

  • concurrency (Symbol, Array<Symbol>, nil) (defaults to: nil)

    Controls how pending tool loops are executed:

    • :call: sequential calls
    • :thread: concurrent threads
    • :task: concurrent async tasks
    • :fiber: concurrent scheduler-backed fibers
    • :ractor: concurrent Ruby ractors for class-based tools; MCP tools are not supported, and this mode is especially useful for CPU-bound tool work
    • [:thread, :ractor]: the possible concurrency strategies to wait on, in the given order. This is useful for mixed tool sets or when work may have been spawned with more than one concurrency strategy.

Returns:

  • (Symbol, Array<Symbol>, nil)


# File 'lib/llm/agent.rb', line 116

def self.concurrency(concurrency = nil)
  return @concurrency if concurrency.nil?
  @concurrency = concurrency
end
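
A sketch of the array form, which tells the agent to wait on more than one strategy; the values are illustrative:

class Batch < LLM::Agent
  concurrency [:thread, :ractor] # wait on threads first, then ractors
end

Batch.concurrency # => [:thread, :ractor]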

.tracer(tracer = nil, &block) ⇒ LLM::Tracer, ...

Set or get the default tracer.

When a block is provided, it is stored and evaluated lazily against the agent instance during initialization so it can build a tracer from the resolved provider.

Examples:

class Agent < LLM::Agent
  tracer { LLM::Tracer::Logger.new(llm, io: $stdout) }
end

Parameters:

  • tracer (LLM::Tracer, Proc, nil) (defaults to: nil)

    The tracer

Yield Returns:

  • (LLM::Tracer)

Returns:

  • (LLM::Tracer, Proc, nil)


# File 'lib/llm/agent.rb', line 136

def self.tracer(tracer = nil, &block)
  return @tracer if tracer.nil? && !block
  @tracer = block || tracer
end

.model(model = nil) ⇒ String?

Set or get the default model

Parameters:

  • model (String, nil) (defaults to: nil)

    The model identifier

Returns:

  • (String, nil)

    Returns the current model when no argument is provided



# File 'lib/llm/agent.rb', line 52

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end

Instance Method Details

#deserialize(**kw) ⇒ Object Also known as: restore



# File 'lib/llm/agent.rb', line 360

def deserialize(**kw)
  @ctx.deserialize(**kw)
end

#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat

Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
agent = LLM::Agent.new(llm)
response = agent.talk("Hello, what is your name?")
puts response.choices[0].content

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The params passed to the provider, including optional :stream, :tools, :schema, etc.

Options Hash (params):

  • :tool_attempts (Integer)

    The maximum number of tool call iterations before the agent sends in-band advisory tool errors back through the model (default 25). Set to nil to disable advisory tool-limit returns.

Returns:



# File 'lib/llm/agent.rb', line 179

def talk(prompt, params = {})
  run_loop(:talk, prompt, params)
end
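
Reusing the SystemAdmin agent from the overview, a hedged sketch of the :tool_attempts option; the budget of 5 is illustrative:

agent = SystemAdmin.new(llm)
agent.talk("Rotate the logs", tool_attempts: 5)    # advisory tool errors after 5 iterations
agent.talk("Check disk usage", tool_attempts: nil) # disable the advisory limit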

#respond(prompt, params = {}) ⇒ LLM::Response

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
agent = LLM::Agent.new(llm)
res = agent.respond("What is the capital of France?")
puts res.output_text

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The params passed to the provider, including optional :stream, :tools, :schema, etc.

Options Hash (params):

  • :tool_attempts (Integer)

    The maximum number of tool call iterations before the agent sends in-band advisory tool errors back through the model (default 25). Set to nil to disable advisory tool-limit returns.

Returns:



# File 'lib/llm/agent.rb', line 201

def respond(prompt, params = {})
  run_loop(:respond, prompt, params)
end

#messages ⇒ LLM::Buffer<LLM::Message>



# File 'lib/llm/agent.rb', line 207

def messages
  @ctx.messages
end

#functions ⇒ Array<LLM::Function>

Returns:



# File 'lib/llm/agent.rb', line 213

def functions
  @tracer ? @llm.with_tracer(@tracer) { @ctx.functions } : @ctx.functions
end

#returns ⇒ Array<LLM::Function::Return>

Returns:

See Also:



# File 'lib/llm/agent.rb', line 220

def returns
  @ctx.returns
end

#call ⇒ Object

Returns:

See Also:



# File 'lib/llm/agent.rb', line 227

def call(...)
  @tracer ? @llm.with_tracer(@tracer) { @ctx.call(...) } : @ctx.call(...)
end

#wait ⇒ Array<LLM::Function::Return>

Returns:

See Also:



# File 'lib/llm/agent.rb', line 234

def wait(...)
  @tracer ? @llm.with_tracer(@tracer) { @ctx.wait(...) } : @ctx.wait(...)
end

#usage ⇒ LLM::Object

Returns:



# File 'lib/llm/agent.rb', line 240

def usage
  @ctx.usage
end

#interrupt! ⇒ nil Also known as: cancel!

Interrupt the active request, if any.

Returns:

  • (nil)


# File 'lib/llm/agent.rb', line 247

def interrupt!
  @ctx.interrupt!
end

#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt

Parameters:

  • b (Proc)

    A block that composes messages. If it takes one argument, it receives the prompt object. Otherwise it runs in prompt context.

Returns:

See Also:



# File 'lib/llm/agent.rb', line 256

def prompt(&b)
  @ctx.prompt(&b)
end

#image_url(url) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • url (String)

    The URL

Returns:



# File 'lib/llm/agent.rb', line 266

def image_url(url)
  @ctx.image_url(url)
end

#local_file(path) ⇒ LLM::Object

Returns a tagged object

Parameters:

  • path (String)

    The path

Returns:



# File 'lib/llm/agent.rb', line 275

def local_file(path)
  @ctx.local_file(path)
end
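
A hedged sketch of passing a tagged file alongside text; the path is illustrative, and the mixed-content array prompt is assumed to be accepted as it is by the underlying LLM::Context:

file = agent.local_file("/tmp/report.pdf")
agent.talk(["Summarize this file", file])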

#remote_file(res) ⇒ LLM::Object

Returns a tagged object

Parameters:

Returns:



# File 'lib/llm/agent.rb', line 284

def remote_file(res)
  @ctx.remote_file(res)
end

#tracer ⇒ LLM::Tracer

Returns an LLM tracer

Returns:



# File 'lib/llm/agent.rb', line 291

def tracer
  @tracer || @ctx.tracer
end

#model ⇒ String

Returns the model an Agent is actively using

Returns:

  • (String)


# File 'lib/llm/agent.rb', line 298

def model
  @ctx.model
end

#mode ⇒ Symbol

Returns:

  • (Symbol)


# File 'lib/llm/agent.rb', line 304

def mode
  @ctx.mode
end

#concurrency ⇒ Symbol, ...

Returns the configured tool execution concurrency.

Returns:

  • (Symbol, Array<Symbol>, nil)


# File 'lib/llm/agent.rb', line 311

def concurrency
  @concurrency
end

#cost ⇒ LLM::Cost

Returns:

See Also:



# File 'lib/llm/agent.rb', line 318

def cost
  @ctx.cost
end

#context_window ⇒ Integer

Returns:

  • (Integer)

See Also:



# File 'lib/llm/agent.rb', line 325

def context_window
  @ctx.context_window
end

#to_h ⇒ Hash

Returns:

  • (Hash)

See Also:



# File 'lib/llm/agent.rb', line 332

def to_h
  @ctx.to_h
end

#to_json ⇒ String

Returns:

  • (String)


# File 'lib/llm/agent.rb', line 338

def to_json(...)
  to_h.to_json(...)
end

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/agent.rb', line 344

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@llm=#{@llm.class}, @mode=#{mode.inspect}, @messages=#{messages.inspect}>"
end

#serialize(**kw) ⇒ void Also known as: save

This method returns an undefined value.



# File 'lib/llm/agent.rb', line 352

def serialize(**kw)
  @ctx.serialize(**kw)
end