Class: LLM::Chat
- Inherits: Object
- Defined in:
  lib/llm/chat.rb,
  lib/llm/chat/builder.rb,
  lib/llm/chat/conversable.rb
Overview
LLM::Chat provides a chat object that maintains a thread of messages which serves as context throughout a conversation. A conversation can use the chat completions API, which most LLM providers support, or the responses API, which only a select few LLM providers support.
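Example: a minimal conversation (a sketch; it assumes an OpenAI key in the environment and that LLM.openai accepts a key: option):

require "llm"

llm = LLM.openai(key: ENV["OPENAI_API_KEY"])
bot = LLM::Chat.new(llm).lazy
bot.chat "You are a helpful assistant", role: :system
bot.chat "What is the capital of France?"
bot.messages.each { |message| puts "#{message.role}: #{message.content}" }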
Defined Under Namespace
Modules: Prompt
Constant Summary
Constants included from LLM
Instance Attribute Summary
- #messages ⇒ Array<LLM::Message> (readonly)
Instance Method Summary
- #lazy ⇒ LLM::Chat
  Enables lazy mode for the conversation.
- #chat(prompt = nil, params = {}) {|prompt| ... } ⇒ LLM::Chat, ...
  Maintain a conversation via the chat completions API.
- #respond(prompt = nil, params = {}) ⇒ LLM::Chat, ...
  Maintain a conversation via the responses API.
- #initialize(provider, params = {}) ⇒ Chat (constructor)
  A new instance of Chat.
- #lazy? ⇒ Boolean
  Returns true if the conversation is lazy.
- #inspect ⇒ String
- #functions ⇒ Array<LLM::Function>
  Returns an array of functions that can be called.
Methods included from LLM
File, anthropic, function, functions, gemini, llamacpp, ollama, openai, voyageai
Constructor Details
#initialize(provider, params = {}) ⇒ Chat
Returns a new instance of Chat.
# File 'lib/llm/chat.rb', line 58

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @lazy = false
  @messages = [].extend(Array)
end
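For illustration, the params hash can override the default model (the model name below is an assumption):

llm = LLM.openai(key: ENV["OPENAI_API_KEY"])
bot = LLM::Chat.new(llm, model: "gpt-4o-mini")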
Instance Attribute Details
#messages ⇒ Array<LLM::Message> (readonly)
# File 'lib/llm/chat.rb', line 46

def messages
  @messages
end
Instance Method Details
#lazy ⇒ LLM::Chat
Enables lazy mode for the conversation.
# File 'lib/llm/chat.rb', line 106

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
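A sketch of the difference between the two modes (assuming eager mode performs one request per #chat call, while lazy mode buffers until the messages are enumerated):

eager = LLM::Chat.new(llm)      # each #chat call performs a request right away
lazy  = LLM::Chat.new(llm).lazy # requests are deferred until messages are read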
#chat(prompt = nil, params = {}) {|prompt| ... } ⇒ LLM::Chat, ...
Maintain a conversation via the chat completions API.

# File 'lib/llm/chat.rb', line 72

def chat(prompt = nil, params = {})
  if block_given?
    yield Prompt::Completion.new(self)
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { lazy? ? async_completion(prompt, params) : sync_completion(prompt, params) }
  end
end
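When given a block, #chat yields a prompt builder so multiple messages can be queued in one call; a sketch that assumes Prompt::Completion exposes role helpers such as #system and #user:

bot.chat do |prompt|
  prompt.system "Answer in rhyme"
  prompt.user "Why is the sky blue?"
end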
#respond(prompt = nil, params = {}) ⇒ LLM::Chat, ...
Note: Not all LLM providers support this API.
Maintain a conversation via the responses API.

# File 'lib/llm/chat.rb', line 91

def respond(prompt = nil, params = {})
  if block_given?
    yield Prompt::Respond.new(self)
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { lazy? ? async_response(prompt, params) : sync_response(prompt, params) }
  end
end
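Usage mirrors #chat; a short sketch (check that your provider supports the responses API first):

bot.respond "You are a coding assistant", role: :system
bot.respond "Write a one-line hello world in Ruby"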
#lazy? ⇒ Boolean
Returns true if the conversation is lazy.

# File 'lib/llm/chat.rb', line 117

def lazy?
  @lazy
end
#inspect ⇒ String
# File 'lib/llm/chat.rb', line 123

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end
#functions ⇒ Array<LLM::Function>
Returns an array of functions that can be called.

# File 'lib/llm/chat.rb', line 132

def functions
  messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .select(&:pending?)
end
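One plausible way to drain pending tool calls; it assumes LLM::Function#call executes the underlying function and that the return values can be handed back to the model via #chat, which is an assumption rather than documented behavior:

bot.chat "What is the weather in Paris?" # the model may reply with tool calls
bot.chat bot.functions.map(&:call)       # assumed: return values are sent back via #chat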