Class: LLM::Chat

Inherits: Object

Defined in: lib/llm/chat.rb

Overview

LLM::Chat provides a chat object that maintains a thread of messages, which serves as context throughout a conversation. A conversation can use the chat completions API, which most LLM providers support, or the responses API, which only a select few LLM providers support.

Examples:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Chat.new(llm).lazy
bot.chat("Your task is to answer all of my questions", :system)
bot.chat("Your answers should be short and concise", :system)
bot.chat("What is 5 + 7 ?", :user)
bot.chat("Why is the sky blue ?", :user)
bot.chat("Why did the chicken cross the road ?", :user)
bot.messages.map { print "[#{_1.role}]", _1.content, "\n" }

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Chat

Returns a new instance of Chat.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation



# File 'lib/llm/chat.rb', line 32

def initialize(provider, params = {})
  @provider = provider
  @params = params
  @lazy = false
  @messages = []
end
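The params given at construction act as defaults for every request: both #chat and #respond merge them with per-call options via @params.merge(params), so a per-call option wins on conflict. A tiny self-contained sketch of that merge (the :model and :temperature keys are hypothetical examples, not documented parameter names):

```ruby
# Defaults set once at construction time (hypothetical keys).
defaults = { model: "some-model", temperature: 0.2 }

# Options passed to a single #chat call.
per_call = { temperature: 0.9 }

# Hash#merge prefers the argument's values on conflicting keys,
# so the per-call option overrides the constructor default.
merged = defaults.merge(per_call)
p merged
```

Note that the merge happens per request, so the constructor defaults remain untouched for subsequent calls.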

Instance Attribute Details

#messages ⇒ Array<LLM::Message> (readonly)

Returns:

  • (Array<LLM::Message>)

# File 'lib/llm/chat.rb', line 25

def messages
  @messages
end

Instance Method Details

#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response

Note:

The read_response and recent_message methods are aliases of the last_message method, and you can choose the name that best fits your context or code style.

The last message in the conversation.

Parameters:

  • role (#to_s) (defaults to: @provider.assistant_role)

    The role of the last message.

Returns:

  • (LLM::Message)

# File 'lib/llm/chat.rb', line 83

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
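The lookup walks the message history from the end, so when several messages share a role the most recent one wins. A minimal, self-contained sketch of that traversal, using a plain Struct as a stand-in for LLM::Message (an assumption for illustration only):

```ruby
# Stand-in for LLM::Message, for illustration only.
Message = Struct.new(:role, :content)

messages = [
  Message.new("system", "Answer concisely"),
  Message.new("user", "What is 5 + 7 ?"),
  Message.new("assistant", "12"),
  Message.new("user", "Why is the sky blue ?"),
  Message.new("assistant", "Rayleigh scattering")
]

# Same traversal as LLM::Chat#last_message: scan from the end and
# return the first message whose role matches.
def last_message(messages, role)
  messages.reverse_each.find { _1.role == role.to_s }
end

puts last_message(messages, :assistant).content # => "Rayleigh scattering"
puts last_message(messages, :user).content      # => "Why is the sky blue ?"
```

Because the default role is the provider's assistant role, calling the method with no arguments returns the model's latest reply.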

#chat(prompt, role = :user, **params) ⇒ LLM::Chat

Maintain a conversation via the chat completions API

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 45

def chat(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :complete]
    self
  else
    completion = complete!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
    self
  end
end
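In eager mode each #chat call performs a request immediately; in lazy mode the prompt and its merged parameters are only queued, and the buffer resolves them later when the messages are read. A rough, self-contained sketch of that queueing idea (MiniChat and fake_reply are invented stand-ins for illustration, not the gem's API):

```ruby
# Illustration of eager vs. lazy queueing. MiniChat is an invented
# stand-in, not the real LLM::Chat; fake_reply stands in for a
# round-trip to a provider.
class MiniChat
  attr_reader :messages, :queued

  def initialize
    @messages = []
    @queued = []
    @lazy = false
  end

  # Chainable, like LLM::Chat#lazy.
  def lazy
    tap { @lazy = true }
  end

  def chat(prompt, role = :user)
    if @lazy
      # Lazy: only remember the pending message; no request is made.
      @queued << [role, prompt]
    else
      # Eager: "send" the request immediately and record the reply.
      @messages << [role, prompt] << [:assistant, fake_reply(prompt)]
    end
    self
  end

  # Drain the queue, roughly what LLM::Buffer does on first read.
  def flush
    @queued.each do |role, prompt|
      @messages << [role, prompt] << [:assistant, fake_reply(prompt)]
    end
    @queued.clear
    self
  end

  private

  def fake_reply(prompt)
    "echo: #{prompt}"
  end
end

bot = MiniChat.new.lazy
bot.chat("What is 5 + 7 ?").chat("Why is the sky blue ?")
puts bot.queued.size   # => 2 (nothing sent yet)
bot.flush
puts bot.messages.size # => 4
```

Queueing lets several prompts be batched before any network traffic happens, which is why the overview example calls .lazy before chaining its #chat calls.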

#respond(prompt, role = :user, **params) ⇒ LLM::Chat

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 63

def respond(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :respond]
    self
  else
    @response = respond!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), @response.outputs[0]]
    self
  end
end

#lazy ⇒ LLM::Chat

Enables lazy mode for the conversation.

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 92

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
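#lazy relies on Kernel#tap, which yields the receiver to its block and then returns the receiver, so the method is chainable (as in LLM::Chat.new(llm).lazy from the overview example), and the next if lazy? guard makes repeated calls a no-op. The same pattern in isolation, with a Switch class invented for illustration:

```ruby
# Kernel#tap yields the receiver and returns it, so setup methods
# built on tap can be chained. Switch is invented for illustration.
class Switch
  attr_reader :enabled_count

  def initialize
    @on = false
    @enabled_count = 0
  end

  def on?
    @on
  end

  def enable
    tap do
      next if on?        # guard: a second call changes nothing
      @on = true
      @enabled_count += 1
    end
  end
end

s = Switch.new
puts s.enable.equal?(s)  # => true: tap returns the receiver itself
puts s.enable.enable.on? # => true
puts s.enabled_count     # => 1: the guard ran the setup only once
```

The guard matters here because enabling twice would otherwise replace the already-populated message buffer with a fresh one.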

#lazy? ⇒ Boolean

Returns true if the conversation is lazy

Returns:

  • (Boolean)

    Returns true if the conversation is lazy



# File 'lib/llm/chat.rb', line 103

def lazy?
  @lazy
end

#inspect ⇒ Object



# File 'lib/llm/chat.rb', line 107

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end