Class: LLM::OpenAI
- Defined in:
- lib/llm/providers/openai.rb,
lib/llm/providers/openai/audio.rb,
lib/llm/providers/openai/files.rb,
lib/llm/providers/openai/format.rb,
lib/llm/providers/openai/images.rb,
lib/llm/providers/openai/models.rb,
lib/llm/providers/openai/responses.rb,
lib/llm/providers/openai/error_handler.rb,
lib/llm/providers/openai/response_parser.rb
Overview
The OpenAI class implements a provider for the OpenAI API.
Direct Known Subclasses
Defined Under Namespace
Classes: Audio, Files, Images, Models, Responses
Constant Summary
- HOST = "api.openai.com"
Instance Method Summary
- #default_model ⇒ String
  Returns the default model for chat completions.
- #embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response::Embedding
  Provides an embedding.
- #complete(prompt, params = {}) ⇒ LLM::Response::Completion
  Provides an interface to the chat completions API.
- #responses ⇒ LLM::OpenAI::Responses
  Provides an interface to OpenAI’s response API.
- #images ⇒ LLM::OpenAI::Images
  Provides an interface to OpenAI’s image generation API.
- #audio ⇒ LLM::OpenAI::Audio
  Provides an interface to OpenAI’s audio generation API.
- #files ⇒ LLM::OpenAI::Files
  Provides an interface to OpenAI’s files API.
- #models ⇒ LLM::OpenAI::Models
  Provides an interface to OpenAI’s models API.
- #assistant_role ⇒ String
  Returns the role of the assistant in the conversation.
- #initialize ⇒ OpenAI (constructor)
  A new instance of OpenAI.
Methods inherited from Provider
#chat, #chat!, #inspect, #respond, #respond!, #schema, #with
Constructor Details
Instance Method Details
#default_model ⇒ String
Returns the default model for chat completions.

# File 'lib/llm/providers/openai.rb', line 115
def default_model
  "gpt-4o-mini"
end
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response::Embedding
Provides an embedding.

# File 'lib/llm/providers/openai.rb', line 36
def embed(input, model: "text-embedding-3-small", **params)
  req = Net::HTTP::Post.new("/v1/embeddings", headers)
  req.body = JSON.dump({input:, model:}.merge!(params))
  res = request(@http, req)
  Response::Embedding.new(res).extend(response_parser)
end
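The body that #embed posts to /v1/embeddings can be reproduced with the Ruby standard library alone. In this sketch the input text, the extra `dimensions` parameter, and the Content-Type header are hypothetical stand-ins; the path and the "text-embedding-3-small" default come from the source above:

```ruby
require "json"
require "net/http"

# Rebuild the POST request body that #embed assembles.
input  = "hello world"                # hypothetical input
model  = "text-embedding-3-small"     # default model from the signature
params = {dimensions: 256}            # hypothetical **params, merged last
req = Net::HTTP::Post.new("/v1/embeddings", {"Content-Type" => "application/json"})
req.body = JSON.dump({input: input, model: model}.merge!(params))
```

Because caller-supplied params are merged after the required keys, they can extend the payload but keyword arguments keep `input` and `model` explicit in the method signature.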
#complete(prompt, params = {}) ⇒ LLM::Response::Completion
Provides an interface to the chat completions API.

# File 'lib/llm/providers/openai.rb', line 53
def complete(prompt, params = {})
  params = {role: :user, model: default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role = params.delete(:role)
  req = Net::HTTP::Post.new("/v1/chat/completions", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  body = JSON.dump({messages: format(messages, :complete).flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
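The parameter handling in #complete can be followed with a stdlib-only sketch: defaults are merged under caller params, :role is pulled out and attached to the new message, and everything else passes through to the request body. Here the `stream` option is a hypothetical caller parameter, and a plain hash stands in for Message.new(role, prompt):

```ruby
require "json"

# Mirror the params merge and body assembly in #complete.
defaults = {role: :user, model: "gpt-4o-mini"}
params = defaults.merge!(stream: true)        # caller params win over defaults
role = params.delete(:role)                   # :role feeds the message, not the body
messages = [{role: role, content: "Hello!"}]  # stand-in for Message.new(role, prompt)
body = JSON.dump({messages: messages}.merge!(params))
```

This is why passing `role: :system` in params changes who the prompt is attributed to, while keys like `model` or `stream` land at the top level of the JSON payload.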
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI’s response API.

# File 'lib/llm/providers/openai.rb', line 69
def responses
  LLM::OpenAI::Responses.new(self)
end
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI’s image generation API.

# File 'lib/llm/providers/openai.rb', line 77
def images
  LLM::OpenAI::Images.new(self)
end
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI’s audio generation API.

# File 'lib/llm/providers/openai.rb', line 85
def audio
  LLM::OpenAI::Audio.new(self)
end
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI’s files API.

# File 'lib/llm/providers/openai.rb', line 93
def files
  LLM::OpenAI::Files.new(self)
end
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI’s models API.

# File 'lib/llm/providers/openai.rb', line 101
def models
  LLM::OpenAI::Models.new(self)
end
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually “assistant” or “model”.

# File 'lib/llm/providers/openai.rb', line 107
def assistant_role
  "assistant"
end