Class: LLM::Agent
Inherits: Object
Defined in: lib/llm/agent.rb
Overview
LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.
Notes:
- Instructions are injected only on the first request.
- An agent will automatically execute tool calls (unlike LLM::Session).
- The idea originally came from RubyLLM and was adapted to llm.rb.
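For illustration, here is a minimal sketch of the DSL; the subclass name, model string, instructions text, and provider setup are assumptions, not part of the library:

  class WeatherAgent < LLM::Agent
    model "gpt-4o-mini"                                 # default model (assumed name)
    instructions "Answer weather questions concisely."  # injected on the first request only
  end

  agent = WeatherAgent.new(LLM.openai(key: ENV["OPENAI_API_KEY"]))
  res = agent.talk("Will it rain in Lisbon tomorrow?")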
Class Method Summary
- .tools(*tools) ⇒ Array<LLM::Function>
  Set or get the default tools.
- .schema(schema = nil) ⇒ #to_json?
  Set or get the default schema.
- .instructions(instructions = nil) ⇒ String?
  Set or get the default instructions.
- .model(model = nil) ⇒ String?
  Set or get the default model.
Instance Method Summary
- #model ⇒ String
  Returns the model an Agent is actively using.
- #initialize(provider, params = {}) ⇒ Agent (constructor)
  A new instance of Agent.
- #talk(prompt, params = {}) ⇒ LLM::Response (also: #chat)
  Maintain a conversation via the chat completions API.
- #respond(prompt, params = {}) ⇒ LLM::Response
  Maintain a conversation via the responses API.
- #messages ⇒ LLM::Buffer<LLM::Message>
- #functions ⇒ Array<LLM::Function>
- #usage ⇒ LLM::Object
- #prompt(&b) ⇒ LLM::Prompt (also: #build_prompt)
- #image_url(url) ⇒ LLM::Object
  Returns a tagged object.
- #local_file(path) ⇒ LLM::Object
  Returns a tagged object.
- #remote_file(res) ⇒ LLM::Object
  Returns a tagged object.
- #tracer ⇒ LLM::Tracer
  Returns an LLM tracer.
Constructor Details
#initialize(provider, params = {}) ⇒ Agent
Returns a new instance of Agent.
# File 'lib/llm/agent.rb', line 80

def initialize(provider, params = {})
  defaults = {model: self.class.model, tools: self.class.tools, schema: self.class.schema}.compact
  @provider = provider
  @ses = LLM::Session.new(provider, defaults.merge(params))
  @instructions_applied = false
end
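A rough usage sketch, assuming an agent subclass like the one in the overview and an OpenAI provider; per-instance params override the class-level defaults:

  provider = LLM.openai(key: ENV["OPENAI_API_KEY"])     # assumed provider setup
  agent = WeatherAgent.new(provider)                    # uses the class-level defaults
  agent = WeatherAgent.new(provider, model: "gpt-4o")   # overrides the default model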
Class Method Details
.tools(*tools) ⇒ Array<LLM::Function>
Set or get the default tools
# File 'lib/llm/agent.rb', line 43

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end
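With no arguments the method acts as a reader; with arguments it replaces (and flattens) the defaults. The tool constants below are hypothetical:

  class OpsAgent < LLM::Agent
    tools SystemTool, FileTool    # hypothetical tools; arrays are flattened
  end
  OpsAgent.tools                  # => [SystemTool, FileTool]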
.schema(schema = nil) ⇒ #to_json?
Set or get the default schema
# File 'lib/llm/agent.rb', line 54

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end
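The schema can be any object that responds to #to_json; a plain Hash works as a minimal sketch (the exact JSON-schema shape a provider expects may differ):

  class ExtractorAgent < LLM::Agent
    schema({
      "type" => "object",
      "properties" => {"city" => {"type" => "string"}},
      "required" => ["city"]
    })   # parentheses keep the Hash literal from being parsed as a block
  end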
.instructions(instructions = nil) ⇒ String?
Set or get the default instructions
# File 'lib/llm/agent.rb', line 65

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end
.model(model = nil) ⇒ String?
Set or get the default model
# File 'lib/llm/agent.rb', line 32

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end
Instance Method Details
#model ⇒ String
Returns the model an Agent is actively using
# File 'lib/llm/agent.rb', line 203

def model
  @ses.model
end
#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat
Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 100

def talk(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @ses.talk(apply_instructions(prompt), params)
  until @ses.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @ses.talk @ses.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end
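Pending tool calls are executed automatically between rounds; the :max_tool_rounds param (default 10) caps how many follow-up requests are made before LLM::ToolLoopError is raised. A sketch, assuming an agent configured with a calculator-style tool:

  res = agent.talk("What is 2 + 2? Use the calculator tool.", max_tool_rounds: 3)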
#respond(prompt, params = {}) ⇒ LLM::Response
Note: Not all LLM providers support this API.
Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 127

def respond(prompt, params = {})
  i, max = 0, Integer(params.delete(:max_tool_rounds) || 10)
  res = @ses.respond(apply_instructions(prompt), params)
  until @ses.functions.empty?
    raise LLM::ToolLoopError, "pending tool calls remain" if i >= max
    res = @ses.respond @ses.functions.map(&:call), params
    i += 1
  end
  @instructions_applied = true
  res
end
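The method mirrors #talk, including automatic tool execution and the :max_tool_rounds param, but goes through the provider's responses API. A sketch:

  res = agent.respond("Summarize our conversation so far")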
#functions ⇒ Array<LLM::Function>
# File 'lib/llm/agent.rb', line 147

def functions
  @ses.functions
end
#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt
# File 'lib/llm/agent.rb', line 161

def prompt(&b)
  @ses.prompt(&b)
end
#image_url(url) ⇒ LLM::Object
Returns a tagged object
# File 'lib/llm/agent.rb', line 171

def image_url(url)
  @ses.image_url(url)
end
#local_file(path) ⇒ LLM::Object
Returns a tagged object
# File 'lib/llm/agent.rb', line 180

def local_file(path)
  @ses.local_file(path)
end
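The tagged object can be combined with text in a single prompt; the file path and the array-style multipart prompt below are assumptions:

  res = agent.talk(["Summarize this document", agent.local_file("report.pdf")])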
#remote_file(res) ⇒ LLM::Object
Returns a tagged object
# File 'lib/llm/agent.rb', line 189

def remote_file(res)
  @ses.remote_file(res)
end
#tracer ⇒ LLM::Tracer
Returns an LLM tracer
# File 'lib/llm/agent.rb', line 196

def tracer
  @ses.tracer
end