Class: LLM::Chat

Inherits: Object
Defined in:
lib/llm/chat.rb,
lib/llm/chat/builder.rb,
lib/llm/chat/conversable.rb

Overview

LLM::Chat provides a chat object that maintains a thread of messages which serves as context throughout a conversation. A conversation can use the chat completions API, which most LLM providers support, or the responses API, which a select few LLM providers support.

Examples:

Block form:

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(ENV["KEY"])
bot  = LLM::Chat.new(llm).lazy
msgs = bot.chat do |prompt|
  prompt.system "Answer the following questions."
  prompt.user "What is 5 + 7 ?"
  prompt.user "Why is the sky blue ?"
  prompt.user "Why did the chicken cross the road ?"
end
msgs.map { print "[#{_1.role}]", _1.content, "\n" }

Message by message:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Chat.new(llm).lazy
bot.chat "Answer the following questions.", role: :system
bot.chat "What is 5 + 7 ?", role: :user
bot.chat "Why is the sky blue ?", role: :user
bot.chat "Why did the chicken cross the road ?", role: :user
bot.messages.map { print "[#{_1.role}]", _1.content, "\n" }

Defined Under Namespace

Modules: Prompt

Constant Summary

Constants included from LLM

VERSION

Instance Attribute Summary

Instance Method Summary

Methods included from LLM

File, anthropic, function, functions, gemini, llamacpp, ollama, openai, voyageai

Constructor Details

#initialize(provider, params = {}) ⇒ Chat

Returns a new instance of Chat.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider’s default model

  • :schema (#to_json, nil)

    Defaults to nil

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil



# File 'lib/llm/chat.rb', line 58

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @lazy = false
  @messages = [].extend(Array)
end
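
As a usage sketch, the constructor can be given any provider-supported parameters up front; the model name below is a placeholder, not a recommendation:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
# "gpt-4o-mini" is a placeholder model name; when :model is omitted the
# provider's default model is used, and :schema / :tools default to nil.
bot = LLM::Chat.new(llm, model: "gpt-4o-mini")
bot.chat "Hello!", role: :user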

Instance Attribute Details

#messages ⇒ Array<LLM::Message> (readonly)

Returns:

  • (Array<LLM::Message>)

# File 'lib/llm/chat.rb', line 46

def messages
  @messages
end

Instance Method Details

#lazy ⇒ LLM::Chat

Enables lazy mode for the conversation.

Returns:

  • (LLM::Chat)

    Returns self

# File 'lib/llm/chat.rb', line 106

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
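
A brief sketch of enabling lazy mode, assuming an `llm` provider object as in the overview examples; repeated calls are harmless because of the `next if lazy?` guard:

bot = LLM::Chat.new(llm)
bot.lazy?   # => false, plain Array-backed messages
bot.lazy    # switches @messages to an LLM::Buffer
bot.lazy?   # => true
bot.lazy    # no-op: the guard above keeps the existing buffer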

#chat(prompt = nil, params = {}) {|prompt| ... } ⇒ LLM::Chat, ...

Maintain a conversation via the chat completions API

Parameters:

  • prompt (String) (defaults to: nil)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not only those listed here.

Yield Parameters:

  • prompt (LLM::Chat::Prompt::Completion)

    Yields a prompt

Returns:

  • (LLM::Chat, Array<LLM::Message>, LLM::Buffer)

    Returns self unless given a block, otherwise returns messages



# File 'lib/llm/chat.rb', line 72

def chat(prompt = nil, params = {})
  if block_given?
    yield Prompt::Completion.new(self)
    messages
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { lazy? ? async_completion(prompt, params) : sync_completion(prompt, params) }
  end
end
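
Since #chat returns self when called without a block, calls can be chained; with a block it returns the messages instead. A short sketch reusing the `bot` object from the examples above:

# Without a block each call returns self, so calls chain.
bot.chat("Answer briefly.", role: :system)
   .chat("What is 5 + 7 ?", role: :user)

# With a block the return value is the message collection.
msgs = bot.chat { |prompt| prompt.user "Why is the sky blue ?" }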

#respond(prompt = nil, params = {}) {|prompt| ... } ⇒ LLM::Chat, ...

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API

Parameters:

  • prompt (String) (defaults to: nil)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not only those listed here.

Returns:

  • (LLM::Chat, Array<LLM::Message>, LLM::Buffer)

    Returns self unless given a block, otherwise returns messages



# File 'lib/llm/chat.rb', line 91

def respond(prompt = nil, params = {})
  if block_given?
    yield Prompt::Respond.new(self)
    messages
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { lazy? ? async_response(prompt, params) : sync_response(prompt, params) }
  end
end
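
A minimal sketch of the responses API, assuming a provider that supports it (OpenAI is used here purely for illustration):

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Chat.new(llm).lazy
bot.respond "Answer the following questions.", role: :system
bot.respond "What is 5 + 7 ?", role: :user
bot.messages.map { print "[#{_1.role}]", _1.content, "\n" }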

#lazy? ⇒ Boolean

Returns true if the conversation is lazy

Returns:

  • (Boolean)

    Returns true if the conversation is lazy



# File 'lib/llm/chat.rb', line 117

def lazy?
  @lazy
end

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/chat.rb', line 123

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end

#functions ⇒ Array<LLM::Function>

Returns an array of functions that can be called

Returns:

  • (Array<LLM::Function>)

    Returns an array of functions that can be called

# File 'lib/llm/chat.rb', line 132

def functions
  messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .select(&:pending?)
end
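
For context, a sketch of checking for pending tool calls after a completion; it relies only on the method above, since the LLM::Function interface itself is not documented on this page:

# #functions returns only the assistant's tool calls that are still
# pending, so an empty array means there is nothing left to resolve.
if bot.functions.empty?
  puts "no pending tool calls"
else
  bot.functions.each { |fn| p fn } # inspect each pending LLM::Function
end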