Class: LLM::OpenAI::Responses

Inherits:
Object
Defined in:
lib/llm/providers/openai/responses.rb

Overview

The LLM::OpenAI::Responses class provides a responses object for interacting with OpenAI's Responses API. The Responses API is similar to the Chat Completions API, but it can maintain conversation state across multiple requests. This is useful when you want to save bandwidth and avoid maintaining the message thread yourself.

Examples:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
res1 = llm.responses.create "Your task is to help me with math", :developer
res2 = llm.responses.create "5 + 5 = ?", :user, previous_response_id: res1.id
[res1, res2].each { llm.responses.delete(_1) }

Instance Method Summary

Constructor Details

#initialize(provider) ⇒ LLM::OpenAI::Responses

Returns a new Responses object

Parameters:

  • provider (LLM::Provider)

    An OpenAI provider

# File 'lib/llm/providers/openai/responses.rb', line 26

def initialize(provider)
  @provider = provider
end

Instance Method Details

#create(prompt, role = :user, model: "gpt-4o-mini", **params) ⇒ LLM::Response::Output

Create a response

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

  • model (String) (defaults to: "gpt-4o-mini")

    The model to use for the completion

  • params (Hash)

    Response params

Returns:

  • (LLM::Response::Output)


# File 'lib/llm/providers/openai/responses.rb', line 39

def create(prompt, role = :user, model: "gpt-4o-mini", **params)
  params   = {model:}.merge!(params)
  req      = Net::HTTP::Post.new("/v1/responses", headers)
  messages = [*(params.delete(:input) || []), LLM::Message.new(role, prompt)]
  req.body = JSON.dump({input: format(messages, :response)}.merge!(params))
  res      = request(http, req)
  LLM::Response::Output.new(res).extend(response_parser)
end
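The body of #create shows that any messages passed through the :input param are placed ahead of the new prompt when the request body is assembled. A minimal sketch of that merge, using a Struct as a stand-in for LLM::Message:

```ruby
# Sketch of how #create assembles its input array: messages given via
# the :input param come first, followed by the new prompt. Message is
# a stand-in for LLM::Message; the prompt text is illustrative.
Message  = Struct.new(:role, :content)
params   = {input: [Message.new(:developer, "Answer in one word")]}
prompt   = Message.new(:user, "5 + 5 = ?")
messages = [*(params.delete(:input) || []), prompt]
messages.map(&:role) # => [:developer, :user]
```

Note that params.delete(:input) also removes :input from the params hash, so it is not serialized twice into the request body.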

#get(response, **params) ⇒ LLM::Response::Output

Get a response

Parameters:

  • response (#id, #to_s)

    Response ID

  • params (Hash)

    Query params
Returns:

  • (LLM::Response::Output)



# File 'lib/llm/providers/openai/responses.rb', line 54

def get(response, **params)
  response_id = response.respond_to?(:id) ? response.id : response
  query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new("/v1/responses/#{response_id}?#{query}", headers)
  res = request(http, req)
  LLM::Response::Output.new(res).extend(response_parser)
end
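The extra keyword params passed to #get are serialized into the URL's query string via the stdlib's URI.encode_www_form. A small sketch of that step, where the :limit key and the "resp_123" ID are hypothetical examples, not values from the source:

```ruby
require "uri"

# How #get turns keyword params into a query string. The :limit key
# and "resp_123" are hypothetical examples for illustration.
params = {limit: 10}
query  = URI.encode_www_form(params)
path   = "/v1/responses/resp_123?#{query}"
path # => "/v1/responses/resp_123?limit=10"
```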

#delete(response) ⇒ OpenStruct

Deletes a response

Parameters:

  • response (#id, #to_s)

    Response ID

Returns:

  • (OpenStruct)



# File 'lib/llm/providers/openai/responses.rb', line 68

def delete(response)
  response_id = response.respond_to?(:id) ? response.id : response
  req = Net::HTTP::Delete.new("/v1/responses/#{response_id}", headers)
  res = request(http, req)
  OpenStruct.from_hash JSON.parse(res.body)
end
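As the source for both #get and #delete shows, the response argument is duck-typed: either an object that responds to #id (such as a previously returned response) or a plain ID string is accepted. The extraction can be sketched as:

```ruby
# Duck-typed ID extraction, as used by #get and #delete: accept an
# object with an #id method, or fall back to the value itself.
# Res and "resp_abc" are stand-ins for illustration.
extract_id = ->(response) { response.respond_to?(:id) ? response.id : response }

Res = Struct.new(:id)
extract_id.call(Res.new("resp_abc")) # => "resp_abc"
extract_id.call("resp_abc")          # => "resp_abc"
```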