Class: LLM::Bot

Inherits:
Object
Defined in:
lib/llm/bot.rb,
lib/llm/bot/builder.rb,
lib/llm/bot/conversable.rb

Overview

LLM::Bot provides an object that can maintain a conversation. A conversation can use the chat completions API, which every LLM provider supports, or the responses API, which currently only OpenAI supports.

Examples:

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(key: ENV["KEY"])
bot  = LLM::Bot.new(llm)
url  = "https://upload.wikimedia.org/wikipedia/commons/thumb/9/9a/Cognac_glass.jpg/500px-Cognac_glass.jpg"
msgs = bot.chat do |prompt|
  prompt.system "Your task is to answer all user queries"
  prompt.user ["Tell me about this URL", URI(url)]
  prompt.user ["Tell me about this pdf", File.open("freebsd_book.pdf", "rb")]
  prompt.user "Are the URL and the PDF similar to each other?"
end

# At this point, we execute a single request
msgs.each { print "[#{_1.role}] ", _1.content, "\n" }

Defined Under Namespace

Modules: Prompt

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Bot

Returns a new instance of Bot.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider’s default model

  • :schema (#to_json, nil)

    Defaults to nil

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil



# File 'lib/llm/bot.rb', line 50

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @messages = LLM::Buffer.new(provider)
end
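The defaults above are plain hash operations: `compact` drops the nil `:schema` entry, and the caller's params win on merge. A minimal sketch with stand-in values (the model names here are hypothetical, not defaults of any real provider):

```ruby
# Mirrors the merge in #initialize using plain hashes.
# "gpt-4o-mini" stands in for provider.default_model.
defaults = {model: "gpt-4o-mini", schema: nil}.compact
params   = {model: "gpt-4.1", tools: []}
merged   = defaults.merge(params)
# compact removed :schema, and the caller's :model won the merge
merged # => {model: "gpt-4.1", tools: []}
```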

Instance Attribute Details

#messages ⇒ LLM::Buffer<LLM::Message> (readonly)

Returns an Enumerable for the messages in a conversation



# File 'lib/llm/bot.rb', line 38

def messages
  @messages
end

Instance Method Details

#chat(prompt, params = {}) ⇒ LLM::Bot #chat(prompt, params) { ... } ⇒ LLM::Buffer

Maintain a conversation via the chat completions API

Overloads:

  • #chat(prompt, params = {}) ⇒ LLM::Bot

    Returns self

    Parameters:

    • params (defaults to: {})

      The params

    • prompt (String)

      The input prompt to be completed

    Returns:

    • (LLM::Bot)

  • #chat(prompt, params) { ... } ⇒ LLM::Buffer

    Returns messages

    Parameters:

    • params

      The params

    • prompt (String)

      The input prompt to be completed

    Yields:

    • prompt Yields a Prompt::Completion object

    Returns:

    • (LLM::Buffer)



# File 'lib/llm/bot.rb', line 69

def chat(prompt = nil, params = {})
  if block_given?
    params = prompt
    yield Prompt::Completion.new(self, params)
    messages
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { async_completion(prompt, params) }
  end
end
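The two call styles can be illustrated with a small stand-in class (the class and names below are illustrative, not part of the gem): the non-block form returns `self` via `tap`, so calls chain and the request is deferred, while the block form returns the message buffer.

```ruby
# Illustrative stand-in for the dispatch in #chat (not the gem's class).
class MiniBot
  attr_reader :messages

  def initialize
    @messages = []
  end

  # Block form: yield a prompt object, return the buffer.
  # Non-block form: queue one message, return self for chaining.
  def chat(prompt = nil, params = {})
    if block_given?
      yield self
      messages
    elsif prompt.nil?
      raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
    else
      tap { @messages << {role: :user}.merge(params).merge(content: prompt) }
    end
  end
  alias user chat
end

bot = MiniBot.new
bot.chat("hello").chat("how are you?")   # chaining works because chat returns self
msgs = bot.chat { |prompt| prompt.user "third message" }
msgs.size # => 3
```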

#respond(prompt, params = {}) ⇒ LLM::Bot #respond(prompt, params) { ... } ⇒ LLM::Buffer

Maintain a conversation via the responses API

Overloads:

  • #respond(prompt, params = {}) ⇒ LLM::Bot

    Returns self

    Parameters:

    • params (defaults to: {})

      The params

    • prompt (String)

      The input prompt to be completed

    Returns:

    • (LLM::Bot)

  • #respond(prompt, params) { ... } ⇒ LLM::Buffer
    Note:

    Not all LLM providers support this API

    Returns messages

    Parameters:

    • params

      The params

    • prompt (String)

      The input prompt to be completed

    Yields:

    • prompt Yields a Prompt::Respond object

    Returns:

    • (LLM::Buffer)



# File 'lib/llm/bot.rb', line 96

def respond(prompt = nil, params = {})
  if block_given?
    params = prompt
    yield Prompt::Respond.new(self, params)
    messages
  elsif prompt.nil?
    raise ArgumentError, "wrong number of arguments (given 0, expected 1)"
  else
    params = {role: :user}.merge!(params)
    tap { async_response(prompt, params) }
  end
end
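Both #chat and #respond merge `{role: :user}` underneath the caller's params, so the default role applies only when the caller does not set one. In plain hashes:

```ruby
# Mirrors the role defaulting in #chat / #respond.
defaulted  = {role: :user}.merge!({})               # no caller override
overridden = {role: :user}.merge!({role: :system})  # caller sets the role
defaulted[:role]  # => :user
overridden[:role] # => :system
```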

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 111

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}>"
end

#functions ⇒ Array<LLM::Function>

Returns an array of pending function calls made by the assistant that have yet to be executed

Returns:

  • (Array<LLM::Function>)



# File 'lib/llm/bot.rb', line 120

def functions
  messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .select(&:pending?)
end
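The filtering above is ordinary Enumerable composition. A runnable sketch with stand-in message and function objects (the structs below are illustrative, not the gem's classes):

```ruby
# Stand-ins that mimic just the predicates #functions relies on.
Message = Struct.new(:role, :functions) do
  def assistant?
    role == :assistant
  end
end

Function = Struct.new(:name, :pending) do
  def pending?
    pending
  end
end

messages = [
  Message.new(:user, []),
  Message.new(:assistant, [Function.new(:get_weather, true),
                           Function.new(:settled_call, false)]),
  Message.new(:assistant, [Function.new(:get_time, true)])
]

# Same pipeline as #functions: assistant messages only,
# their functions flattened, pending calls kept.
pending = messages
  .select(&:assistant?)
  .flat_map(&:functions)
  .select(&:pending?)

pending.map(&:name) # => [:get_weather, :get_time]
```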