Class: LLM::Bot

Inherits:
  • Object
Defined in:
lib/llm/bot.rb,
lib/llm/bot/builder.rb,
lib/llm/bot/conversable.rb

Overview

LLM::Bot provides a bot object that can maintain a conversation. A conversation can use the chat completions API, which all LLM providers support, or the responses API, which a select few LLM providers support.

Examples:

example #1

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(ENV["KEY"])
bot  = LLM::Bot.new(llm)
msgs = bot.chat do |prompt|
  prompt.system "Answer the following questions."
  prompt.user "What is 5 + 7 ?"
  prompt.user "Why is the sky blue ?"
  prompt.user "Why did the chicken cross the road ?"
end
msgs.each { print "[#{_1.role}]", _1.content, "\n" }

example #2

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Bot.new(llm)
bot.chat "Answer the following questions.", role: :system
bot.chat "What is 5 + 7 ?", role: :user
bot.chat "Why is the sky blue ?", role: :user
bot.chat "Why did the chicken cross the road ?", role: :user
bot.messages.each { print "[#{_1.role}]", _1.content, "\n" }

Defined Under Namespace

Modules: Prompt

Constant Summary

Constants included from LLM

FormatError, PromptError, RateLimitError, UnauthorizedError, VERSION

Instance Attribute Summary collapse

Instance Method Summary collapse

Methods included from LLM

File, anthropic, deepseek, function, functions, gemini, llamacpp, ollama, openai, voyageai

Constructor Details

#initialize(provider, params = {}) ⇒ Bot

Returns a new instance of Bot.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider’s default model

  • :schema (#to_json, nil)

    Defaults to nil

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil



# File 'lib/llm/bot.rb', line 59

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @messages = LLM::Buffer.new(provider)
end
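
The params given to the constructor are merged into every request. A minimal sketch, assuming an OpenAI key in ENV["KEY"]; the model name is illustrative, and :schema or :tools could be given the same way:

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
# "gpt-4o-mini" is illustrative; any model the provider supports
# can be named here, and :schema / :tools are optional extras.
bot = LLM::Bot.new(llm, model: "gpt-4o-mini")
bot.chat "Hello!", role: :user
```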

Instance Attribute Details

#messages ⇒ LLM::Buffer<LLM::Message> (readonly)

Returns an Enumerable for the messages in a conversation

Returns:

  • (LLM::Buffer<LLM::Message>)

# File 'lib/llm/bot.rb', line 47

def messages
  @messages
end

Instance Method Details

#chat(prompt, params = {}) ⇒ LLM::Bot
#chat(prompt, params) { ... } ⇒ LLM::Buffer

Maintain a conversation via the chat completions API

Overloads:

  • #chat(prompt, params = {}) ⇒ LLM::Bot

    Returns self

    Parameters:

    • prompt (String)

      The input prompt to be completed

    • params (Hash) (defaults to: {})

      The params

    Returns:

    • (LLM::Bot)

      Returns self

  • #chat(prompt, params) { ... } ⇒ LLM::Buffer

    Returns messages

    Parameters:

    • prompt (String)

      The input prompt to be completed

    • params (Hash)

      The params

    Yields:

    • prompt Yields a prompt

    Returns:

    • (LLM::Buffer)

      Returns messages



# File 'lib/llm/bot.rb', line 78

def chat(prompt = nil, params = {})
  if block_given?
    params = prompt
    yield Prompt::Completion.new(self, params)
    messages
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { async_completion(prompt, params) }
  end
end
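
Because the non-block form returns self, calls can be chained, while the block form batches messages and returns the buffer. A sketch, assuming an OpenAI key in ENV["KEY"]:

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Bot.new(llm)
# The non-block form returns the bot itself, so calls chain:
bot.chat("Answer tersely.", role: :system)
   .chat("What is 5 + 7 ?", role: :user)
bot.messages.each { print "[#{_1.role}]", _1.content, "\n" }
```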

#respond(prompt, params = {}) ⇒ LLM::Bot
#respond(prompt, params) { ... } ⇒ LLM::Buffer

Maintain a conversation via the responses API

Overloads:

  • #respond(prompt, params = {}) ⇒ LLM::Bot

    Returns self

    Parameters:

    • prompt (String)

      The input prompt to be completed

    • params (Hash) (defaults to: {})

      The params

    Returns:

    • (LLM::Bot)

      Returns self

  • #respond(prompt, params) { ... } ⇒ LLM::Buffer
    Note:

    Not all LLM providers support this API

    Returns messages

    Parameters:

    • prompt (String)

      The input prompt to be completed

    • params (Hash)

      The params

    Yields:

    • prompt Yields a prompt

    Returns:

    • (LLM::Buffer)

      Returns messages



# File 'lib/llm/bot.rb', line 105

def respond(prompt = nil, params = {})
  if block_given?
    params = prompt
    yield Prompt::Respond.new(self, params)
    messages
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { async_response(prompt, params) }
  end
end
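
Usage mirrors #chat, with requests sent through the responses API instead. A sketch; not every provider implements this API, though OpenAI does:

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Bot.new(llm)
# As with #chat, the role defaults to :user.
bot.respond "What is 5 + 7 ?"
bot.messages.each { print "[#{_1.role}]", _1.content, "\n" }
```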

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 120

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}>"
end

#functions ⇒ Array<LLM::Function>

Returns an array of pending function calls made by the assistant that have not been called yet

Returns:

  • (Array<LLM::Function>)

129
130
131
132
133
134
# File 'lib/llm/bot.rb', line 129

def functions
  messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .select(&:pending?)
end
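
A common pattern is to call each pending function and feed the results back into the conversation. A sketch, assuming the LLM.function DSL included from the LLM module; the :add tool below is hypothetical:

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
# A hypothetical tool; fn.description / fn.params / fn.define
# follow the LLM.function DSL included from the LLM module.
tool = LLM.function(:add) do |fn|
  fn.description "Add two integers"
  fn.params { |schema| schema.object(a: schema.integer.required, b: schema.integer.required) }
  fn.define { |a:, b:| a + b }
end
bot = LLM::Bot.new(llm, tools: [tool])
bot.chat "What is 5 + 7 ? Use the add tool.", role: :user
# Call each pending function and send the results back:
bot.chat bot.functions.map(&:call)
```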