Class: LLM::Agent
- Inherits: Object
- Defined in: lib/llm/agent.rb
Overview
LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.
It wraps the same stateful runtime surface as LLM::Context: message history, usage, persistence, streaming parameters, and provider-backed requests still flow through an underlying context. The defining behavior of an agent is that it automatically resolves pending tool calls for you during talk and respond, instead of leaving tool loops to the caller.
Notes:
- Instructions are injected once unless a system message is already present.
- An agent automatically executes tool loops (unlike LLM::Context).
- The automatic tool loop enables the wrapped context's guard by default. The built-in LLM::LoopGuard detects repeated tool-call patterns and blocks stuck execution before more tool work is queued.
- The default tool attempt budget is 25. After that, the agent sends advisory tool errors back through the model and keeps the loop in-band. Set tool_attempts: nil to disable that advisory behavior.
- Tool loop execution can be configured with concurrency: :call, :thread, :task, :fiber, :ractor, or a list of queued task types such as [:thread, :ractor].
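For example, the class-level DSL might be used like this (the provider constructor, model name, and SearchTool are assumptions for illustration; this sketch requires the llm gem and provider credentials):

```ruby
require "llm"

# A reusable, preconfigured assistant. SearchTool stands in for a real
# LLM::Function; the model name and provider constructor are assumptions.
class Researcher < LLM::Agent
  model "gpt-4o-mini"
  instructions "You are a terse research assistant."
  tools SearchTool
  concurrency :thread   # run queued tool calls on threads
end

llm   = LLM.openai(key: ENV["OPENAI_API_KEY"])
agent = Researcher.new(llm)
# Unlike LLM::Context, #talk resolves pending tool calls automatically
# before returning the response.
res = agent.talk("Find and summarize recent changes.")
```

Because the tool loop is automatic, the caller never has to inspect pending functions or send returns back to the model by hand.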
Instance Attribute Summary
- #llm ⇒ LLM::Provider (readonly)
  Returns a provider.
Class Method Summary
- .tools(*tools) ⇒ Array<LLM::Function>
  Set or get the default tools.
- .skills(*skills) ⇒ Array<String>?
  Set or get the default skills.
- .schema(schema = nil) ⇒ #to_json?
  Set or get the default schema.
- .instructions(instructions = nil) ⇒ String?
  Set or get the default instructions.
- .concurrency(concurrency = nil) ⇒ Symbol, ...
  Set or get the tool execution concurrency.
- .tracer(tracer = nil, &block) ⇒ LLM::Tracer, ...
  Set or get the default tracer.
- .model(model = nil) ⇒ String?
  Set or get the default model.
Instance Method Summary
- #deserialize(**kw) ⇒ Object (also: #restore)
- #initialize(llm, params = {}) ⇒ Agent (constructor)
  A new instance of Agent.
- #talk(prompt, params = {}) ⇒ LLM::Response (also: #chat)
  Maintain a conversation via the chat completions API.
- #respond(prompt, params = {}) ⇒ LLM::Response
  Maintain a conversation via the responses API.
- #messages ⇒ LLM::Buffer<LLM::Message>
- #functions ⇒ Array<LLM::Function>
- #returns ⇒ Array<LLM::Function::Return>
- #call ⇒ Object
- #wait ⇒ Array<LLM::Function::Return>
- #usage ⇒ LLM::Object
- #interrupt! ⇒ nil (also: #cancel!)
  Interrupt the active request, if any.
- #prompt(&b) ⇒ LLM::Prompt (also: #build_prompt)
- #image_url(url) ⇒ LLM::Object
  Returns a tagged object.
- #local_file(path) ⇒ LLM::Object
  Returns a tagged object.
- #remote_file(res) ⇒ LLM::Object
  Returns a tagged object.
- #tracer ⇒ LLM::Tracer
  Returns an LLM tracer.
- #model ⇒ String
  Returns the model an Agent is actively using.
- #mode ⇒ Symbol
- #concurrency ⇒ Symbol, ...
  Returns the configured tool execution concurrency.
- #cost ⇒ LLM::Cost
- #context_window ⇒ Integer
- #to_h ⇒ Hash
- #to_json ⇒ String
- #inspect ⇒ String
- #serialize(**kw) ⇒ void (also: #save)
Constructor Details
#initialize(llm, params = {}) ⇒ Agent
Returns a new instance of Agent.
# File 'lib/llm/agent.rb', line 154

def initialize(llm, params = {})
  defaults = {model: self.class.model, tools: self.class.tools,
              skills: self.class.skills, schema: self.class.schema}.compact
  @concurrency = params.delete(:concurrency) || self.class.concurrency
  @llm = llm
  tracer = params.key?(:tracer) ? params.delete(:tracer) : self.class.tracer
  @tracer = resolve_option(tracer) unless tracer.nil?
  @ctx = LLM::Context.new(llm, defaults.merge({guard: true}).merge(params))
end
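The merge order in the constructor means caller-supplied params override both the class-level defaults and the built-in guard: true. A minimal pure-Ruby sketch of that precedence (the hash keys here are illustrative):

```ruby
# Later merges win: class defaults < {guard: true} < caller params,
# mirroring defaults.merge({guard: true}).merge(params) above.
defaults = {model: "default-model", tools: []}.compact
params   = {model: "override-model", guard: false}

layered = defaults.merge({guard: true}).merge(params)
layered  # => {model: "override-model", tools: [], guard: false}
```

So passing guard: false explicitly would disable the loop guard even though the agent enables it by default.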
Instance Attribute Details
#llm ⇒ LLM::Provider (readonly)
Returns a provider.

# File 'lib/llm/agent.rb', line 44

def llm
  @llm
end
Class Method Details
.tools(*tools) ⇒ Array<LLM::Function>
Set or get the default tools.

# File 'lib/llm/agent.rb', line 63

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end
.skills(*skills) ⇒ Array<String>?
Set or get the default skills.

# File 'lib/llm/agent.rb', line 74

def self.skills(*skills)
  return @skills if skills.empty?
  @skills = skills.flatten
end
.schema(schema = nil) ⇒ #to_json?
Set or get the default schema.

# File 'lib/llm/agent.rb', line 85

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end
.instructions(instructions = nil) ⇒ String?
Set or get the default instructions.

# File 'lib/llm/agent.rb', line 96

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end
.concurrency(concurrency = nil) ⇒ Symbol, ...
Set or get the tool execution concurrency.
# File 'lib/llm/agent.rb', line 116

def self.concurrency(concurrency = nil)
  return @concurrency if concurrency.nil?
  @concurrency = concurrency
end
.tracer(tracer = nil, &block) ⇒ LLM::Tracer, ...
Set or get the default tracer.
When a block is provided, it is stored and evaluated lazily against the agent instance during initialization so it can build a tracer from the resolved provider.
# File 'lib/llm/agent.rb', line 136

def self.tracer(tracer = nil, &block)
  return @tracer if tracer.nil? && !block
  @tracer = block || tracer
end
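The lazy-block behavior can be sketched in standalone Ruby (TracedThing and the string tracer are illustrative stand-ins, not the real resolve_option logic):

```ruby
# Illustrative sketch: a block given to the class-level .tracer is
# stored as-is and evaluated in instance context at initialization,
# so it can reference state resolved by the constructor (e.g. @llm).
class TracedThing
  def self.tracer(tracer = nil, &block)
    return @tracer if tracer.nil? && !block
    @tracer = block || tracer
  end

  attr_reader :tracer

  def initialize(llm)
    @llm = llm
    stored = self.class.tracer
    # A stored Proc is evaluated against the instance; anything else
    # is used directly.
    @tracer = stored.is_a?(Proc) ? instance_exec(&stored) : stored
  end
end

TracedThing.tracer { "tracer-for-#{@llm}" }
TracedThing.new("openai").tracer  # => "tracer-for-openai"
```

Note how the block reads @llm even though @llm does not exist when the block is defined; instance_exec rebinds self to the new instance.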
.model(model = nil) ⇒ String?
Set or get the default model.

# File 'lib/llm/agent.rb', line 52

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end
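These class methods all follow the same set-or-get pattern: call with no arguments to read, with arguments to write. Because they store state in class-level instance variables, each subclass keeps its own defaults. A standalone sketch (Agentlike and the subclass names are illustrative):

```ruby
# Standalone sketch of the set-or-get accessors; each subclass stores
# its defaults in its own class-level instance variables.
class Agentlike
  def self.model(model = nil)
    return @model if model.nil?
    @model = model
  end

  def self.tools(*tools)
    return @tools || [] if tools.empty?
    @tools = tools.flatten
  end
end

class ResearchAgent < Agentlike
  model "model-a"          # hypothetical model name
  tools :search, :fetch
end

class WriterAgent < Agentlike
  model "model-b"
end

ResearchAgent.model  # => "model-a"
ResearchAgent.tools  # => [:search, :fetch]
WriterAgent.model    # => "model-b"
WriterAgent.tools    # => []
```

One consequence of this sketch: class-level instance variables are not inherited, so a default set on a parent class is not automatically visible from a subclass that never sets its own.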
Instance Method Details
#deserialize(**kw) ⇒ Object Also known as: restore
# File 'lib/llm/agent.rb', line 360

def deserialize(**kw)
  @ctx.deserialize(**kw)
end
#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat
Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 179

def talk(prompt, params = {})
  run_loop(:talk, prompt, params)
end
#respond(prompt, params = {}) ⇒ LLM::Response
Note: Not all LLM providers support this API.
Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.

# File 'lib/llm/agent.rb', line 201

def respond(prompt, params = {})
  run_loop(:respond, prompt, params)
end
#functions ⇒ Array<LLM::Function>
# File 'lib/llm/agent.rb', line 213

def functions
  @tracer ? @llm.with_tracer(@tracer) { @ctx.functions } : @ctx.functions
end
#returns ⇒ Array<LLM::Function::Return>
# File 'lib/llm/agent.rb', line 220

def returns
  @ctx.returns
end
#call ⇒ Object
# File 'lib/llm/agent.rb', line 227

def call(...)
  @tracer ? @llm.with_tracer(@tracer) { @ctx.call(...) } : @ctx.call(...)
end
#wait ⇒ Array<LLM::Function::Return>
# File 'lib/llm/agent.rb', line 234

def wait(...)
  @tracer ? @llm.with_tracer(@tracer) { @ctx.wait(...) } : @ctx.wait(...)
end
#interrupt! ⇒ nil Also known as: cancel!
Interrupt the active request, if any.
# File 'lib/llm/agent.rb', line 247

def interrupt!
  @ctx.interrupt!
end
#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt
# File 'lib/llm/agent.rb', line 256

def prompt(&b)
  @ctx.prompt(&b)
end
#image_url(url) ⇒ LLM::Object
Returns a tagged object.

# File 'lib/llm/agent.rb', line 266

def image_url(url)
  @ctx.image_url(url)
end
#local_file(path) ⇒ LLM::Object
Returns a tagged object.

# File 'lib/llm/agent.rb', line 275

def local_file(path)
  @ctx.local_file(path)
end
#remote_file(res) ⇒ LLM::Object
Returns a tagged object.

# File 'lib/llm/agent.rb', line 284

def remote_file(res)
  @ctx.remote_file(res)
end
#tracer ⇒ LLM::Tracer
Returns an LLM tracer.

# File 'lib/llm/agent.rb', line 291

def tracer
  @tracer || @ctx.tracer
end
#model ⇒ String
Returns the model an Agent is actively using.

# File 'lib/llm/agent.rb', line 298

def model
  @ctx.model
end
#mode ⇒ Symbol
# File 'lib/llm/agent.rb', line 304

def mode
  @ctx.mode
end
#concurrency ⇒ Symbol, ...
Returns the configured tool execution concurrency.
# File 'lib/llm/agent.rb', line 311

def concurrency
  @concurrency
end
#context_window ⇒ Integer
# File 'lib/llm/agent.rb', line 325

def context_window
  @ctx.context_window
end
#to_h ⇒ Hash
# File 'lib/llm/agent.rb', line 332

def to_h
  @ctx.to_h
end
#to_json ⇒ String
# File 'lib/llm/agent.rb', line 338

def to_json(...)
  to_h.to_json(...)
end
#inspect ⇒ String
# File 'lib/llm/agent.rb', line 344

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@llm=#{@llm.class}, @mode=#{mode.inspect}, @messages=#{messages.inspect}>"
end
#serialize(**kw) ⇒ void Also known as: save
This method returns an undefined value.
# File 'lib/llm/agent.rb', line 352

def serialize(**kw)
  @ctx.serialize(**kw)
end