Class: LLM::Conversation

Inherits:
Object
Defined in:
lib/llm/conversation.rb

Overview

LLM::Conversation provides a conversation object that maintains a thread of messages, which serves as context throughout the conversation.

Examples:

llm = LLM.openai(ENV["KEY"])
convo = llm.chat("You are my climate expert", :system)
convo.chat("What's the climate like in Rio de Janeiro?", :user)
convo.chat("What's the climate like in Algiers?", :user)
convo.chat("What's the climate like in Tokyo?", :user)
p convo.messages.map { [_1.role, _1.content] }

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Conversation

Returns a new instance of Conversation.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation



# File 'lib/llm/conversation.rb', line 25

def initialize(provider, params = {})
  @provider = provider
  @params = params
  @lazy = false
  @messages = []
end

Instance Attribute Details

#messages ⇒ Array<LLM::Message> (readonly)

Returns:

  • (Array<LLM::Message>)

# File 'lib/llm/conversation.rb', line 18

def messages
  @messages
end

Instance Method Details

#chat(prompt, role = :user, **params) ⇒ LLM::Conversation

Returns:

  • (LLM::Conversation)

# File 'lib/llm/conversation.rb', line 35

def chat(prompt, role = :user, **params)
  tap do
    if lazy?
      @messages << [LLM::Message.new(role, prompt), @params.merge(params)]
    else
      completion = complete(prompt, role, params)
      @messages.concat [Message.new(role, prompt), completion.choices[0]]
    end
  end
end
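Because #chat wraps its body in Kernel#tap, it always returns the conversation itself, which is what makes the chained calls in the overview example possible. A minimal sketch of that return-self pattern, using a hypothetical Chain class (not part of the LLM library):

```ruby
# Chain is a stand-in class that mimics how #chat returns `self`
# via Kernel#tap, allowing calls to be chained.
class Chain
  attr_reader :log

  def initialize
    @log = []
  end

  def chat(prompt)
    # tap yields self to the block and then returns self,
    # so the side effect happens and chaining still works.
    tap { @log << prompt }
  end
end

chain = Chain.new.chat("one").chat("two").chat("three")
chain.log # => ["one", "two", "three"]
```

The same property is why `convo.chat(...)` can be called repeatedly on the object returned by the provider.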

#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response

Note:

The read_response and recent_message methods are aliases of the last_message method, and you can choose the name that best fits your context or code style.

Returns the last message for the given role.

Parameters:

  • role (#to_s) (defaults to: @provider.assistant_role)

    The role of the last message. Defaults to the LLM's assistant role (e.g. "assistant" or "model")

Returns:

  • (LLM::Message)

# File 'lib/llm/conversation.rb', line 56

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
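The lookup scans the message thread from newest to oldest and returns the first message whose role matches (or nil if none does). A self-contained sketch of the same idiom, using a plain Struct in place of LLM::Message:

```ruby
# Struct stands in for LLM::Message to keep the sketch self-contained.
Message = Struct.new(:role, :content)

messages = [
  Message.new("system", "You are my climate expert"),
  Message.new("user", "What's the climate like in Tokyo?"),
  Message.new("assistant", "Tokyo has a humid subtropical climate.")
]

# reverse_each.find walks the array backwards lazily and stops
# at the first match, i.e. the most recent message for the role.
last_assistant = messages.reverse_each.find { _1.role == "assistant" }
last_assistant.content # => "Tokyo has a humid subtropical climate."
```

Note that role is compared with `role.to_s` in the real method, so both symbols and strings work as the keyword argument.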

#lazy ⇒ LLM::Conversation

Enables lazy mode for the conversation.

Returns:

  • (LLM::Conversation)

# File 'lib/llm/conversation.rb', line 65

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::MessageQueue.new(@provider)
  end
end
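In lazy mode, #chat queues each prompt (see the lazy? branch of #chat above) instead of sending it immediately, and the queue is drained by LLM::MessageQueue when the messages are eventually read. The deferral idea can be sketched with a hypothetical LazyQueue class; this is an illustration of the concept, not the LLM::MessageQueue implementation:

```ruby
# LazyQueue is a hypothetical stand-in for LLM::MessageQueue:
# prompts accumulate in a pending buffer and are only "flushed"
# when the messages are read.
class LazyQueue
  def initialize
    @pending = []
    @sent = []
  end

  def <<(prompt)
    @pending << prompt   # defer: nothing is sent yet
  end

  def to_a
    @sent.concat(@pending)  # flush pending prompts on read
    @pending.clear
    @sent.dup
  end
end

queue = LazyQueue.new
queue << "What's the climate like in Tokyo?"
queue.to_a # prompts are only flushed here, on read
```

The practical upshot is that a lazy conversation can batch several #chat calls and defer network traffic until the responses are actually needed.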

#lazy? ⇒ Boolean

Returns true if the conversation is lazy

Returns:

  • (Boolean)

    Returns true if the conversation is lazy



# File 'lib/llm/conversation.rb', line 76

def lazy?
  @lazy
end