Class: LLM::Agent
- Inherits: Object
- Defined in: lib/llm/agent.rb
Overview
LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.
Notes:
- Instructions are injected only on the first request.
- An agent will automatically execute tool calls (unlike LLM::Session).
- The idea originally came from RubyLLM and was adapted to llm.rb.
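Based on the class methods documented below, a preconfigured agent subclass might be defined as follows. This is a minimal sketch: the `AgentBase` stand-in reproduces only the documented get-or-set DSL so the example runs without the gem, and the model name and instruction text are hypothetical.

```ruby
# Minimal stand-in mirroring LLM::Agent's documented class-level DSL
# (.model, .instructions, .tools). Not the real base class.
class AgentBase
  def self.model(model = nil)
    return @model if model.nil?
    @model = model
  end

  def self.instructions(instructions = nil)
    return @instructions if instructions.nil?
    @instructions = instructions
  end

  def self.tools(*tools)
    return @tools || [] if tools.empty?
    @tools = tools.flatten
  end
end

# A hypothetical preconfigured assistant: defaults are declared once at
# the class level and later merged into each instance's session params.
class SupportAgent < AgentBase
  model "gpt-4o-mini" # hypothetical model name
  instructions "You answer support questions concisely."
end

SupportAgent.model        # => "gpt-4o-mini"
SupportAgent.instructions # => "You answer support questions concisely."
```

Each DSL method acts as both reader and writer: called with no argument it returns the stored default, called with an argument it sets it.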
Instance Attribute Summary collapse
- #llm ⇒ LLM::Provider (readonly)
  Returns a provider.
Class Method Summary collapse
- .tools(*tools) ⇒ Array<LLM::Function>
  Set or get the default tools.
- .schema(schema = nil) ⇒ #to_json?
  Set or get the default schema.
- .instructions(instructions = nil) ⇒ String?
  Set or get the default instructions.
- .model(model = nil) ⇒ String?
  Set or get the default model.
Instance Method Summary collapse
- #deserialize(**kw) ⇒ LLM::Session (also: #restore)
- #initialize(llm, params = {}) ⇒ Agent constructor
  A new instance of Agent.
- #talk(prompt, params = {}) ⇒ LLM::Response (also: #chat)
  Maintain a conversation via the chat completions API.
- #respond(prompt, params = {}) ⇒ LLM::Response
  Maintain a conversation via the responses API.
- #messages ⇒ LLM::Buffer<LLM::Message>
- #functions ⇒ Array<LLM::Function>
- #usage ⇒ LLM::Object
- #prompt(&b) ⇒ LLM::Prompt (also: #build_prompt)
- #image_url(url) ⇒ LLM::Object
  Returns a tagged object.
- #local_file(path) ⇒ LLM::Object
  Returns a tagged object.
- #remote_file(res) ⇒ LLM::Object
  Returns a tagged object.
- #tracer ⇒ LLM::Tracer
  Returns an LLM tracer.
- #model ⇒ String
  Returns the model an Agent is actively using.
- #serialize(**kw) ⇒ void (also: #save)
Constructor Details
#initialize(llm, params = {}) ⇒ Agent
Returns a new instance of Agent.
# File 'lib/llm/agent.rb', line 85

def initialize(llm, params = {})
  defaults = {model: self.class.model, tools: self.class.tools, schema: self.class.schema}.compact
  @llm = llm
  @ses = LLM::Session.new(llm, defaults.merge(params))
end
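The constructor collects the class-level defaults, drops any that were never set via `Hash#compact`, and merges the caller's params on top, so per-instance params win on conflict. A small sketch of that precedence (the values are hypothetical):

```ruby
# Class-level defaults: unset entries are nil and removed by #compact.
defaults = {model: "gpt-4o-mini", tools: nil, schema: nil}.compact
# Per-instance params passed to the constructor.
params   = {model: "o3-mini"}
# Hash#merge prefers the receiver's values being overridden by the
# argument's on key collision, so the caller's model wins.
merged   = defaults.merge(params)
merged # => {model: "o3-mini"}
```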
Instance Attribute Details
#llm ⇒ LLM::Provider (readonly)
Returns a provider.
# File 'lib/llm/agent.rb', line 29

def llm
  @llm
end
Class Method Details
.tools(*tools) ⇒ Array<LLM::Function>
Set or get the default tools.
# File 'lib/llm/agent.rb', line 48

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end
.schema(schema = nil) ⇒ #to_json?
Set or get the default schema.
# File 'lib/llm/agent.rb', line 59

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end
.instructions(instructions = nil) ⇒ String?
Set or get the default instructions.
# File 'lib/llm/agent.rb', line 70

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end
.model(model = nil) ⇒ String?
Set or get the default model.
# File 'lib/llm/agent.rb', line 37

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end
Instance Method Details
#deserialize(**kw) ⇒ LLM::Session Also known as: restore
# File 'lib/llm/agent.rb', line 220

def deserialize(**kw)
  @ses.deserialize(**kw)
end
#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat
Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 104

def talk(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @ses.talk(apply_instructions(prompt), params)
  until @ses.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @ses.talk @ses.functions.map(&:call), params
    i += 1
  end
  res
end
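The loop keeps executing pending tool calls and re-sending their results until the session reports none remaining, giving up after `:max_tool_rounds` iterations (default 10). A simplified stand-in shows the control flow; `StubSession` and `ToolLoopError` here are hypothetical substitutes for `LLM::Session` and `LLM::ToolLoopError`:

```ruby
ToolLoopError = Class.new(StandardError)

# Stub session: each #talk resolves one pending tool call, mimicking a
# model that requests two tool invocations before producing an answer.
class StubSession
  def initialize
    @pending = ["lookup_weather", "format_reply"]
  end

  def functions
    @pending
  end

  def talk(_prompt, _params = {})
    @pending.shift
    "response with #{@pending.size} pending tool call(s)"
  end
end

# Mirrors the documented #talk loop: run tools and re-send until no
# pending functions remain, bounded by max_tool_rounds.
def run(ses, prompt, max_tool_rounds: 10)
  i = 0
  res = ses.talk(prompt)
  until ses.functions.empty?
    raise ToolLoopError, "pending tool calls remain" if i >= max_tool_rounds
    res = ses.talk(ses.functions)
    i += 1
  end
  res
end

run(StubSession.new, "What is the weather?")
# => "response with 0 pending tool call(s)"
```

This is what distinguishes the agent from a bare session, per the Overview: the agent drives the tool loop itself instead of returning pending calls to the caller.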
#respond(prompt, params = {}) ⇒ LLM::Response
Note: Not all LLM providers support this API.
Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 130

def respond(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @ses.respond(apply_instructions(prompt), params)
  until @ses.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @ses.respond @ses.functions.map(&:call), params
    i += 1
  end
  res
end
#functions ⇒ Array<LLM::Function>
# File 'lib/llm/agent.rb', line 149

def functions
  @ses.functions
end
#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt
# File 'lib/llm/agent.rb', line 163

def prompt(&b)
  @ses.prompt(&b)
end
#image_url(url) ⇒ LLM::Object
Returns a tagged object.
# File 'lib/llm/agent.rb', line 173

def image_url(url)
  @ses.image_url(url)
end
#local_file(path) ⇒ LLM::Object
Returns a tagged object.
# File 'lib/llm/agent.rb', line 182

def local_file(path)
  @ses.local_file(path)
end
#remote_file(res) ⇒ LLM::Object
Returns a tagged object.
# File 'lib/llm/agent.rb', line 191

def remote_file(res)
  @ses.remote_file(res)
end
#tracer ⇒ LLM::Tracer
Returns an LLM tracer.
# File 'lib/llm/agent.rb', line 198

def tracer
  @ses.tracer
end
#model ⇒ String
Returns the model an Agent is actively using.
# File 'lib/llm/agent.rb', line 205

def model
  @ses.model
end
#serialize(**kw) ⇒ void Also known as: save
This method returns an undefined value.
# File 'lib/llm/agent.rb', line 212

def serialize(**kw)
  @ses.serialize(**kw)
end