Class: LLM::OpenAI::Responses
- Inherits: Object
  - Object
  - LLM::OpenAI::Responses
- Defined in:
- lib/llm/providers/openai/responses.rb
Overview
The LLM::OpenAI::Responses class provides a responses object for interacting with OpenAI's Responses API. The Responses API is similar to the chat completions API, but it can maintain conversation state across multiple requests. This is useful when you want to save bandwidth and avoid maintaining the message thread yourself.
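To make the state-keeping concrete, here is a toy, self-contained illustration (not library code, and not the real server) of what the Responses API does for you server-side: each response is stored under an id, and a follow-up request that references a previous response id is handled with the earlier messages prepended automatically.

```ruby
store = {}
next_id = 0

# "create" a response: the server records the full thread under a fresh id.
# A follow-up that passes previous_response_id continues that thread.
create = lambda do |input, previous_response_id: nil|
  thread = previous_response_id ? store[previous_response_id] : []
  thread += [input, "(reply to: #{input})"]
  id = "resp_#{next_id += 1}"
  store[id] = thread
  {id: id, thread: thread}
end

first = create.call("Hello!")
second = create.call("Tell me more", previous_response_id: first[:id])
second[:thread]
# => ["Hello!", "(reply to: Hello!)", "Tell me more", "(reply to: Tell me more)"]
```

The client never resends the earlier turns; it only sends the new input plus a reference to the previous response, which is where the bandwidth saving comes from.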
Instance Method Summary collapse
-
#initialize(provider) ⇒ LLM::OpenAI::Responses
constructor
Returns a new Responses object.
-
#create(prompt, params = {}) ⇒ LLM::Response::Output
Create a response.
-
#get(response, **params) ⇒ LLM::Response::Output
Get a response.
-
#delete(response) ⇒ LLM::Object
Deletes a response.
Constructor Details
#initialize(provider) ⇒ LLM::OpenAI::Responses
Returns a new Responses object
# File 'lib/llm/providers/openai/responses.rb', line 42

def initialize(provider)
  @provider = provider
end
Instance Method Details
#create(prompt, params = {}) ⇒ LLM::Response::Output
Create a response
# File 'lib/llm/providers/openai/responses.rb', line 55

def create(prompt, params = {})
  params = {role: :user, model: @provider.default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role = params.delete(:role)
  req = Net::HTTP::Post.new("/v1/responses", headers)
  messages = [*(params.delete(:input) || []), LLM::Message.new(role, prompt)]
  body = JSON.dump({input: [format(messages, :response)].flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = execute(request: req)
  LLM::Response::Respond.new(res).extend(response_parser)
end
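The first two lines of #create merge caller params over the defaults, then pull :role out before the request body is built. A standalone reproduction of that merging (the model name here is illustrative, not the gem's actual default):

```ruby
# Defaults lose to caller-supplied params; :role is consumed separately
# and does not end up in the request body.
defaults = {role: :user, model: "gpt-4o-mini"}
params = defaults.merge!({role: :system, model: "o1"})
role = params.delete(:role)
role    # => :system
params  # => {model: "o1"}
```

Because merge! favors the argument's keys, passing role: or model: in params overrides the defaults, while omitting them falls back to :user and the provider's default model.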
#get(response, **params) ⇒ LLM::Response::Output
Get a response
# File 'lib/llm/providers/openai/responses.rb', line 73

def get(response, **params)
  response_id = response.respond_to?(:id) ? response.id : response
  query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new("/v1/responses/#{response_id}?#{query}", headers)
  res = execute(request: req)
  LLM::Response::Respond.new(res).extend(response_parser)
end
#delete(response) ⇒ LLM::Object
Deletes a response
# File 'lib/llm/providers/openai/responses.rb', line 87

def delete(response)
  response_id = response.respond_to?(:id) ? response.id : response
  req = Net::HTTP::Delete.new("/v1/responses/#{response_id}", headers)
  res = execute(request: req)
  LLM::Object.from_hash JSON.parse(res.body)
end