Class: LLM::OpenAI::Responses

Inherits:
Object
Defined in:
lib/llm/providers/openai/responses.rb

Overview

The LLM::OpenAI::Responses class provides a responses object for interacting with OpenAI’s responses API. The responses API is similar to the chat completions API, but it can maintain conversation state across multiple requests. This is useful when you want to save bandwidth and/or avoid maintaining the message thread yourself.

Examples:

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(ENV["KEY"])
res1 = llm.responses.create "Your task is to help me with math", role: :developer
res2 = llm.responses.create "5 + 5 = ?", role: :user, previous_response_id: res1.id
[res1, res2].each { llm.responses.delete(_1) }

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(ENV["KEY"])
file = llm.files.create file: "/images/hat.png"
res  = llm.responses.create ["Describe the image", file]

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(ENV["KEY"])
file = llm.files.create file: "/documents/freebsd.pdf"
res  = llm.responses.create ["Describe the document", file]

Instance Method Summary

  • #create(prompt, params = {}) ⇒ LLM::Response::Respond

    Create a response.

  • #get(response, **params) ⇒ LLM::Response::Respond

    Get a response.

  • #delete(response) ⇒ OpenStruct

    Deletes a response.

Constructor Details

#initialize(provider) ⇒ LLM::OpenAI::Responses

Returns a new Responses object.

Parameters:

  • provider (LLM::Provider)

    The underlying provider



# File 'lib/llm/providers/openai/responses.rb', line 40

def initialize(provider)
  @provider = provider
end
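
A Responses object is usually obtained through the provider rather than constructed directly; a minimal sketch, assuming an API key in ENV["KEY"]:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
responses = llm.responses # an LLM::OpenAI::Responses bound to the provider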

Instance Method Details

#create(prompt, params = {}) ⇒ LLM::Response::Respond

Create a response

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not just those listed here.

Returns:

  • (LLM::Response::Respond)




# File 'lib/llm/providers/openai/responses.rb', line 53

def create(prompt, params = {})
  params = {role: :user, model: @provider.default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role = params.delete(:role)
  req = Net::HTTP::Post.new("/v1/responses", headers)
  messages = [*(params.delete(:input) || []), LLM::Message.new(role, prompt)]
  body = JSON.dump({input: [format(messages, :response)].flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = request(http, req)
  LLM::Response::Respond.new(res).extend(response_parser)
end
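
Because unlisted keys are forwarded to the provider as-is, request options can be passed straight through #create. A short sketch; the model name and the temperature parameter are illustrative assumptions about what the endpoint accepts:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
res = llm.responses.create(
  "Summarize the FreeBSD handbook in one sentence",
  role: :user,
  model: "gpt-4o-mini", # overrides @provider.default_model
  temperature: 0.2      # forwarded verbatim to the responses API
)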

#get(response, **params) ⇒ LLM::Response::Respond

Get a response

Parameters:

  • response (#id, #to_s)

    Response ID

Returns:

  • (LLM::Response::Respond)




# File 'lib/llm/providers/openai/responses.rb', line 71

def get(response, **params)
  response_id = response.respond_to?(:id) ? response.id : response
  query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new("/v1/responses/#{response_id}?#{query}", headers)
  res = request(http, req)
  LLM::Response::Respond.new(res).extend(response_parser)
end
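
The first argument may be a response object or a raw ID, since #get calls #id whenever the receiver responds to it; any keyword arguments are URL-encoded into the query string. A minimal sketch:

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(ENV["KEY"])
res  = llm.responses.create "Hello there", role: :user
same = llm.responses.get(res)    # pass the response object ...
same = llm.responses.get(res.id) # ... or just its ID string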

#delete(response) ⇒ OpenStruct

Deletes a response

Parameters:

  • response (#id, #to_s)

    Response ID

Returns:

  • (OpenStruct)



# File 'lib/llm/providers/openai/responses.rb', line 85

def delete(response)
  response_id = response.respond_to?(:id) ? response.id : response
  req = Net::HTTP::Delete.new("/v1/responses/#{response_id}", headers)
  res = request(http, req)
  OpenStruct.from_hash JSON.parse(res.body)
end
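
#delete accepts the same duck-typed argument as #get. A minimal sketch; the deleted field is an assumption about the payload OpenAI returns on deletion:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
res = llm.responses.create "Temporary request", role: :user
out = llm.responses.delete(res)
out.deleted # => true, assuming OpenAI includes a deleted flag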