Class: LLM::Session

Inherits: Object
Defined in:
lib/llm/bot.rb

Overview

LLM::Session provides an object that can maintain a conversation. A conversation can use the chat completions API that all LLM providers support or the responses API that currently only OpenAI supports.

Examples:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)

prompt = LLM::Prompt.new(llm) do
  system "Be concise and show your reasoning briefly."
  user "If a train goes 60 mph for 1.5 hours, how far does it travel?"
  user "Now double the speed for the same time."
end

ses.talk(prompt)
ses.messages.each { |m| puts "[#{m.role}] #{m.content}" }

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Session

Returns a new instance of Session.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider's default model

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil



# File 'lib/llm/bot.rb', line 40

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @messages = LLM::Buffer.new(provider)
end
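
For instance, a model and a set of tools can be pinned once for the whole conversation. A minimal sketch, assuming llm is a provider instance and my_function is an LLM::Function defined elsewhere; the model name is illustrative, not a default:

ses = LLM::Session.new(llm, model: "gpt-4o-mini", tools: [my_function])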

Instance Attribute Details

#messages ⇒ LLM::Buffer<LLM::Message> (readonly)

Returns an Enumerable for the messages in a conversation



# File 'lib/llm/bot.rb', line 29

def messages
  @messages
end

Instance Method Details

#model ⇒ String

Returns the model a Session is actively using

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 189

def model
  messages.find(&:assistant?)&.model || @params[:model]
end

#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat

Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)
res = ses.talk("Hello, what is your name?")
puts res.choices[0].content

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The request parameters, including optional :role (defaults to :user), :stream, :tools, :schema, etc.

Returns:

  • (LLM::Response)

# File 'lib/llm/bot.rb', line 58

def talk(prompt, params = {})
  prompt, params, messages = fetch(prompt, params)
  params = params.merge(messages: [*@messages.to_a, *messages])
  params = @params.merge(params)
  res = @provider.complete(prompt, params)
  @messages.concat [LLM::Message.new(params[:role] || :user, prompt)]
  @messages.concat messages
  @messages.concat [res.choices[-1]]
  res
end
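
Because per-call params are merged over the session-level params, a single request can override defaults or attach extra options. A brief sketch; my_schema is a hypothetical placeholder for whatever schema object your provider accepts:

res = ses.talk("Summarize our discussion so far")
res = ses.talk("Answer in JSON", schema: my_schema) # my_schema is hypothetical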

#respond(prompt, params = {}) ⇒ LLM::Response

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.

Examples:

llm = LLM.openai(key: ENV["KEY"])
ses = LLM::Session.new(llm)
res = ses.respond("What is the capital of France?")
puts res.output_text

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The request parameters, including optional :role (defaults to :user), :stream, :tools, :schema, etc.

Returns:

  • (LLM::Response)

# File 'lib/llm/bot.rb', line 83

def respond(prompt, params = {})
  prompt, params, messages = fetch(prompt, params)
  res_id = @messages.find(&:assistant?)&.response&.response_id
  params = params.merge(previous_response_id: res_id, input: messages).compact
  params = @params.merge(params)
  res = @provider.responses.create(prompt, params)
  @messages.concat [LLM::Message.new(params[:role] || :user, prompt)]
  @messages.concat messages
  @messages.concat [res.choices[-1]]
  res
end
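
Since the previous response id is looked up from the last assistant message automatically, consecutive calls continue the same server-side thread. A short sketch grounded in the source above:

ses.respond("What is the capital of France?")
res = ses.respond("And what is its population?") # sent with previous_response_id set
puts res.output_text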

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/bot.rb', line 97

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}>"
end

#functions ⇒ Array<LLM::Function>

Returns an array of functions that can be called

Returns:

  • (Array<LLM::Function>)

# File 'lib/llm/bot.rb', line 106

def functions
  @messages
    .select(&:assistant?)
    .flat_map do |msg|
      fns = msg.functions.select(&:pending?)
      # Enumerable#each returns its receiver, so after attaching the
      # tracer and model the block yields fns back to flat_map.
      fns.each do |fn|
        fn.tracer = tracer
        fn.model  = msg.model
      end
    end
end
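
Pending function calls can be executed and their results sent back on the next turn. A sketch, assuming each LLM::Function responds to #call and that #talk accepts the returned values; both are assumptions, not confirmed by this page:

ses.talk("What is the weather in Tokyo?")
until ses.functions.empty?
  ses.talk ses.functions.map(&:call) # run each pending tool, reply with the results
end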

#usage ⇒ LLM::Object

Note:

This method returns token usage for the latest assistant message, and it returns an empty object if there are no assistant messages.

Returns token usage for the conversation.

Returns:

  • (LLM::Object)

# File 'lib/llm/bot.rb', line 125

def usage
  @messages.find(&:assistant?)&.usage || LLM::Object.from({})
end
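
For example, token counts can be inspected after at least one assistant reply. The attribute names below are assumptions; they follow whatever usage payload the provider reports:

ses.talk("Hello!")
u = ses.usage
puts u.input_tokens  # attribute name assumed
puts u.output_tokens # attribute name assumed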

#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt

Build a role-aware prompt for a single request.

Prefer this method over #build_prompt. The older method name is kept for backward compatibility.

Examples:

prompt = ses.prompt do
  system "Your task is to assist the user"
  user "Hello, can you assist me?"
end
ses.talk(prompt)

Parameters:

  • b (Proc)

    A block that composes messages. If it takes one argument, it receives the prompt object. Otherwise it runs in prompt context.

Returns:

  • (LLM::Prompt)

# File 'lib/llm/bot.rb', line 144

def prompt(&b)
  LLM::Prompt.new(@provider, &b)
end
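
The one-argument block form described above behaves the same as the contextual form. A sketch of that variant:

prompt = ses.prompt do |p|
  p.system "Your task is to assist the user"
  p.user "Hello, can you assist me?"
end
ses.talk(prompt)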

#image_url(url) ⇒ LLM::Object

Recognize an object as a URL to an image

Parameters:

  • url (String)

    The URL

Returns:

  • (LLM::Object)

# File 'lib/llm/bot.rb', line 155

def image_url(url)
  LLM::Object.from(value: url, kind: :image_url)
end

#local_file(path) ⇒ LLM::Object

Recognize an object as a local file

Parameters:

  • path (String)

    The path

Returns:

  • (LLM::Object)

# File 'lib/llm/bot.rb', line 165

def local_file(path)
  LLM::Object.from(value: LLM.File(path), kind: :local_file)
end

#remote_file(res) ⇒ LLM::Object

Recognize an object as a remote file

Parameters:

  • res (LLM::Response)

    A response object that represents a remotely hosted file

Returns:

  • (LLM::Object)

# File 'lib/llm/bot.rb', line 175

def remote_file(res)
  LLM::Object.from(value: res, kind: :remote_file)
end
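
These helpers tag content so the provider can serialize it correctly. A sketch of mixing text and tagged content in one message, assuming #talk accepts an array of parts; the URL and path are placeholders:

ses.talk [
  "Compare these two images",
  ses.image_url("https://example.com/cat.png"),
  ses.local_file("dog.png")
]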

#tracer ⇒ LLM::Tracer

Returns an LLM tracer

Returns:

  • (LLM::Tracer)

# File 'lib/llm/bot.rb', line 182

def tracer
  @provider.tracer
end