Class: LLM::OpenAI::Responses

Inherits:
Object
Defined in:
lib/llm/providers/openai/responses.rb

Overview

The LLM::OpenAI::Responses class provides an interface to OpenAI's Responses API.

Examples:

example #1

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
res1 = llm.responses.create "Your task is to answer the user's questions", role: :developer
res2 = llm.responses.create "5 + 5 = X ?", role: :user, previous_response_id: res1.id
[res1, res2].each { llm.responses.delete(_1) }


Constructor Details

#initialize(provider) ⇒ LLM::OpenAI::Responses

Returns a new Responses object

Parameters:

  • provider (LLM::Provider)

    The provider instance

# File 'lib/llm/providers/openai/responses.rb', line 25

def initialize(provider)
  @provider = provider
end

Instance Method Details

#create(prompt, params = {}) ⇒ LLM::Response

Create a response

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports can be included, not just those listed here.

Returns:

  • (LLM::Response)


# File 'lib/llm/providers/openai/responses.rb', line 38

def create(prompt, params = {})
  params = {role: :user, model: @provider.default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role, stream = params.delete(:role), params.delete(:stream)
  params[:stream] = true if stream.respond_to?(:<<) || stream == true
  req = Net::HTTP::Post.new("/v1/responses", headers)
  messages = [*(params.delete(:input) || []), LLM::Message.new(role, prompt)]
  body = JSON.dump({input: [format(messages, :response)].flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = execute(request: req, stream:, stream_parser:)
  LLM::Response.new(res).extend(LLM::OpenAI::Response::Responds)
end
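The first few lines of #create normalize the params hash before the request is built. This standalone sketch mirrors that normalization; the model name is a placeholder, not the provider's actual default:

```ruby
# Mirrors the param normalization in #create; "gpt-4o-mini" is a
# placeholder for @provider.default_model.
defaults = {role: :user, model: "gpt-4o-mini"}
params   = defaults.merge!({role: :developer, stream: $stdout})

role, stream = params.delete(:role), params.delete(:stream)
# Streaming is enabled by an IO-like object (anything responding to #<<)
# or a literal `true`.
params[:stream] = true if stream.respond_to?(:<<) || stream == true

p role            # => :developer
p params[:stream] # => true
```

Passing an IO such as `$stdout` as `:stream` therefore both enables streaming and designates the object that receives the streamed chunks.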

#get(response, **params) ⇒ LLM::Response

Get a response

Parameters:

  • response (#id, #to_s)

    Response ID

Returns:

  • (LLM::Response)



# File 'lib/llm/providers/openai/responses.rb', line 57

def get(response, **params)
  response_id = response.respond_to?(:id) ? response.id : response
  query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new("/v1/responses/#{response_id}?#{query}", headers)
  res = execute(request: req)
  LLM::Response.new(res).extend(LLM::OpenAI::Response::Responds)
end
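Extra keyword params passed to #get are serialized onto the query string with URI.encode_www_form. A small illustration, where the `include` parameter is an example value rather than a requirement:

```ruby
require "uri"

# How #get builds its query string from extra keyword params.
params = {include: "message.input_image.image_url"}
query  = URI.encode_www_form(params)
p "/v1/responses/resp_123?#{query}"
# => "/v1/responses/resp_123?include=message.input_image.image_url"
```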

#delete(response) ⇒ LLM::Object

Delete a response

Parameters:

  • response (#id, #to_s)

    Response ID

Returns:

  • (LLM::Object)



# File 'lib/llm/providers/openai/responses.rb', line 71

def delete(response)
  response_id = response.respond_to?(:id) ? response.id : response
  req = Net::HTTP::Delete.new("/v1/responses/#{response_id}", headers)
  res = execute(request: req)
  LLM::Response.new(res)
end
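Both #get and #delete resolve their argument the same way: an object that responds to #id contributes its ID, while anything else is used as the ID directly. A minimal illustration of that duck-typing, where `extract_id` is a hypothetical helper and not part of the class:

```ruby
require "ostruct"

# Hypothetical helper reproducing the check in #get and #delete.
def extract_id(response)
  response.respond_to?(:id) ? response.id : response
end

p extract_id(OpenStruct.new(id: "resp_123")) # => "resp_123"
p extract_id("resp_123")                     # => "resp_123"
```

This is why the Parameters sections above list the type as (#id, #to_s): either a full response object or a bare ID string is accepted.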