Class: LLM::Chat
- Inherits: Object
- Defined in: lib/llm/chat.rb
Overview
LLM::Chat provides a chat object that maintains a thread of messages that acts as context throughout a conversation. A conversation can use the chat completions API that most LLM providers support or the responses API that a select few LLM providers support.
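The core idea of a conversation thread can be sketched in plain Ruby: each call appends the user prompt and the assistant reply to a shared message array, so later calls carry the full context. ToyChat and its echoing "provider" below are stand-ins for illustration only, not llm.rb APIs.

```ruby
# Minimal, self-contained sketch of a message thread acting as context.
Message = Struct.new(:role, :content)

class ToyChat
  attr_reader :messages

  def initialize
    @messages = []
  end

  def chat(prompt, role = :user)
    @messages << Message.new(role.to_s, prompt)
    # A real provider would receive the whole @messages thread and
    # return a completion; here we just echo to show the thread growing.
    @messages << Message.new("assistant", "echo: #{prompt}")
    self
  end
end

bot = ToyChat.new
bot.chat("hello").chat("how are you?")
p bot.messages.size # => 4
```

Returning `self` from `chat` is what makes calls chainable, mirroring how `LLM::Chat#chat` and `#respond` return the chat object.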
Instance Attribute Summary
-
#messages ⇒ Array<LLM::Message> (readonly)
Instance Method Summary
-
#last_message(role: @provider.assistant_role) ⇒ LLM::Message
(also: #recent_message, #read_response)
The last message in the conversation.
-
#chat(prompt, role = :user, **params) ⇒ LLM::Chat
Maintain a conversation via the chat completions API.
-
#respond(prompt, role = :user, **params) ⇒ LLM::Chat
Maintain a conversation via the responses API.
-
#initialize(provider, params = {}) ⇒ Chat
constructor
A new instance of Chat.
-
#lazy ⇒ LLM::Chat
Enables lazy mode for the conversation.
-
#lazy? ⇒ Boolean
Returns true if the conversation is lazy.
-
#inspect ⇒ Object
Constructor Details
#initialize(provider, params = {}) ⇒ Chat
Returns a new instance of Chat.
# File 'lib/llm/chat.rb', line 32

def initialize(provider, params = {})
  @provider = provider
  @params = params
  @lazy = false
  @messages = []
end
Instance Attribute Details
#messages ⇒ Array<LLM::Message> (readonly)
# File 'lib/llm/chat.rb', line 25

def messages
  @messages
end
Instance Method Details
#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response
The read_response and recent_message methods are aliases of the last_message method; choose the name that best fits your context or code style.
The last message in the conversation.
# File 'lib/llm/chat.rb', line 83

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
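The lookup performed here can be seen in isolation: scan the thread from the end and return the first message whose role matches. Message below is a stand-in Struct, not the real LLM::Message class.

```ruby
# Find the most recent message with a given role by scanning backwards.
Message = Struct.new(:role, :content)

thread = [
  Message.new("user", "hi"),
  Message.new("assistant", "hello"),
  Message.new("user", "bye"),
  Message.new("assistant", "goodbye")
]

last = thread.reverse_each.find { _1.role == "assistant" }
p last.content # => "goodbye"
```

`reverse_each` avoids building a reversed copy of the array; `find` stops at the first match from the end.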
#chat(prompt, role = :user, **params) ⇒ LLM::Chat
Maintain a conversation via the chat completions API.
# File 'lib/llm/chat.rb', line 45

def chat(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :complete]
    self
  else
    completion = complete!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
    self
  end
end
#respond(prompt, role = :user, **params) ⇒ LLM::Chat
Note: not all LLM providers support this API.
Maintain a conversation via the responses API.
# File 'lib/llm/chat.rb', line 63

def respond(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :respond]
    self
  else
    @response = respond!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), @response.outputs[0]]
    self
  end
end
#lazy ⇒ LLM::Chat
Enables lazy mode for the conversation.
# File 'lib/llm/chat.rb', line 92

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
#lazy? ⇒ Boolean
Returns true if the conversation is lazy.
# File 'lib/llm/chat.rb', line 103

def lazy?
  @lazy
end
#inspect ⇒ Object
# File 'lib/llm/chat.rb', line 107

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
    "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
    "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end