Class: LLM::OpenAI
- Defined in:
- lib/llm/providers/openai.rb,
lib/llm/providers/openai/audio.rb,
lib/llm/providers/openai/files.rb,
lib/llm/providers/openai/images.rb,
lib/llm/providers/openai/models.rb,
lib/llm/providers/openai/responses.rb,
lib/llm/providers/openai/moderations.rb,
lib/llm/providers/openai/error_handler.rb,
lib/llm/providers/openai/stream_parser.rb,
lib/llm/providers/openai/vector_stores.rb,
lib/llm/providers/openai/request_adapter.rb,
lib/llm/providers/openai/response_adapter.rb,
lib/llm/providers/openai/responses/stream_parser.rb
Overview
The OpenAI class implements a provider for OpenAI.
Defined Under Namespace
Classes: Audio, Files, Images, Models, Moderations, Responses, VectorStores
Constant Summary
- HOST = "api.openai.com"
Instance Method Summary
- #web_search(query:) ⇒ LLM::Response
  A convenience method for performing a web search using the OpenAI web search tool.
- #name ⇒ Symbol
  Returns the provider's name.
- #embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response
  Provides an embedding.
- #complete(prompt, params = {}) ⇒ LLM::Response
  Provides an interface to the chat completions API.
- #responses ⇒ LLM::OpenAI::Responses
  Provides an interface to OpenAI's Responses API.
- #images ⇒ LLM::OpenAI::Images
  Provides an interface to OpenAI's image generation API.
- #audio ⇒ LLM::OpenAI::Audio
  Provides an interface to OpenAI's audio generation API.
- #files ⇒ LLM::OpenAI::Files
  Provides an interface to OpenAI's files API.
- #models ⇒ LLM::OpenAI::Models
  Provides an interface to OpenAI's models API.
- #moderations ⇒ LLM::OpenAI::Moderations
  Provides an interface to OpenAI's moderation API.
- #vector_stores ⇒ LLM::OpenAI::VectorStores
  Provides an interface to OpenAI's vector store API.
- #assistant_role ⇒ String
  Returns the role of the assistant in the conversation.
- #default_model ⇒ String
  Returns the default model for chat completions.
- #server_tools ⇒ Hash{Symbol => LLM::ServerTool}
  Returns the provider's built-in server tools.
- #initialize ⇒ OpenAI (constructor)
  A new instance of OpenAI.
Methods inherited from Provider
#chat, clients, #developer_role, #inspect, #persist!, #respond, #schema, #server_tool, #system_role, #tool_role, #tracer, #tracer=, #user_role, #with
Constructor Details
Instance Method Details
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the OpenAI web search tool.
# File 'lib/llm/providers/openai.rb', line 180

def web_search(query:)
  ResponseAdapter.adapt(
    responses.create(query, store: false, tools: [server_tools[:web_search]]),
    type: :web_search
  )
end
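A minimal, stdlib-only sketch of the kind of Responses API request this path produces. The exact wire format is an assumption for illustration; llm.rb builds it through responses.create and server_tools[:web_search] rather than by hand.

```ruby
require "json"
require "net/http"

# Hand-build a Responses API payload with the web_search tool enabled.
# The payload shape is an assumption; llm.rb assembles it internally.
payload = {
  model: "gpt-4.1",
  input: "latest Ruby release",
  store: false,
  tools: [{type: "web_search"}]
}

req = Net::HTTP::Post.new("/v1/responses", {"Content-Type" => "application/json"})
req.body = JSON.dump(payload)
```

Sending the request would additionally require the host, TLS, and an Authorization header, which the provider supplies.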
#name ⇒ Symbol
Returns the provider's name.
# File 'lib/llm/providers/openai.rb', line 42

def name
  :openai
end
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response
Provides an embedding.
# File 'lib/llm/providers/openai.rb', line 54

def embed(input, model: "text-embedding-3-small", **params)
  req = Net::HTTP::Post.new("/v1/embeddings", headers)
  req.body = LLM.json.dump({input:, model:}.merge!(params))
  res, span, tracer = execute(request: req, operation: "embeddings", model:)
  res = ResponseAdapter.adapt(res, type: :embedding)
  tracer.on_request_finish(operation: "embeddings", model:, res:, span:)
  res
end
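The request construction above can be mirrored with the standard library alone. In this sketch JSON stands in for LLM.json, the headers hash stands in for the provider's authenticated headers, and dimensions is one example of an extra keyword forwarded via **params.

```ruby
require "json"
require "net/http"

# Stand-ins for the provider's internals (assumptions, for illustration).
headers = {"Content-Type" => "application/json"}
model = "text-embedding-3-small"
params = {dimensions: 256}

# Build the embeddings request the same way #embed does.
req = Net::HTTP::Post.new("/v1/embeddings", headers)
req.body = JSON.dump({input: "hello world", model: model}.merge!(params))
```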
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API.
# File 'lib/llm/providers/openai.rb', line 73

def complete(prompt, params = {})
  params, stream, tools, role = normalize_complete_params(params)
  req, messages = build_complete_request(prompt, params, role)
  tracer.on_request_start(user_input: extract_user_input(messages, fallback: prompt))
  res, span, tracer = execute(request: req, stream: stream, operation: "chat", model: params[:model])
  res = ResponseAdapter.adapt(res, type: :completion)
          .extend(Module.new { define_method(:__tools__) { tools } })
  tracer.on_request_finish(operation: "chat", model: params[:model], res:, span:)
  res
end
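For reference, a chat completions request body in the shape #complete ultimately sends. The normalization helpers (normalize_complete_params and friends) are internal, so this hand-builds an equivalent payload; the field names follow the chat completions API, not llm.rb internals.

```ruby
require "json"

# An illustrative chat completions payload (assumed shape).
payload = {
  model: "gpt-4.1",
  messages: [
    {role: "system", content: "You are a terse assistant."},
    {role: "user", content: "Hello!"}
  ],
  stream: false
}
body = JSON.dump(payload)
```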
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI's Responses API.
# File 'lib/llm/providers/openai.rb', line 88

def responses
  LLM::OpenAI::Responses.new(self)
end
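The sub-API readers (#responses, #images, #audio, and so on) all follow one pattern: construct a small facade around the provider on each call, so the facade can reuse the provider's connection and credentials. FakeProvider and FakeResponses below are stand-ins for illustration only, not llm.rb classes.

```ruby
# A facade that keeps a reference back to the provider that built it.
class FakeResponses
  attr_reader :provider

  def initialize(provider)
    @provider = provider
  end
end

# The provider hands itself to each facade it constructs.
class FakeProvider
  def responses
    FakeResponses.new(self)
  end
end

provider = FakeProvider.new
api = provider.responses
```

Note that each call returns a fresh facade; the facades are cheap wrappers, and state lives in the provider.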
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI's image generation API.
# File 'lib/llm/providers/openai.rb', line 96

def images
  LLM::OpenAI::Images.new(self)
end
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI's audio generation API.
# File 'lib/llm/providers/openai.rb', line 104

def audio
  LLM::OpenAI::Audio.new(self)
end
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI's files API.
# File 'lib/llm/providers/openai.rb', line 112

def files
  LLM::OpenAI::Files.new(self)
end
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI's models API.
# File 'lib/llm/providers/openai.rb', line 120

def models
  LLM::OpenAI::Models.new(self)
end
#moderations ⇒ LLM::OpenAI::Moderations
Provides an interface to OpenAI's moderation API.
# File 'lib/llm/providers/openai.rb', line 129

def moderations
  LLM::OpenAI::Moderations.new(self)
end
#vector_stores ⇒ LLM::OpenAI::VectorStores
Provides an interface to OpenAI's vector store API.
# File 'lib/llm/providers/openai.rb', line 137

def vector_stores
  LLM::OpenAI::VectorStores.new(self)
end
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually "assistant" or "model".
# File 'lib/llm/providers/openai.rb', line 143

def assistant_role
  "assistant"
end
#default_model ⇒ String
Returns the default model for chat completions.
# File 'lib/llm/providers/openai.rb', line 151

def default_model
  "gpt-4.1"
end
#server_tools ⇒ Hash{Symbol => LLM::ServerTool}
Returns the provider's built-in server tools. These tools require configuration through a set of options that are easier to set via the LLM::Provider#server_tool method.
# File 'lib/llm/providers/openai.rb', line 161

def server_tools
  {
    web_search: server_tool(:web_search),
    file_search: server_tool(:file_search),
    image_generation: server_tool(:image_generation),
    code_interpreter: server_tool(:code_interpreter),
    computer_use: server_tool(:computer_use)
  }
end
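The returned structure is simply a symbol-keyed hash of tool descriptors, so a tool can be passed to a request with an ordinary hash lookup (as #web_search does with server_tools[:web_search]). The sketch below uses a stand-in Struct for LLM::ServerTool to show the shape.

```ruby
# Stand-in for LLM::ServerTool (assumption, for illustration only).
ServerTool = Struct.new(:name)

def server_tool(name)
  ServerTool.new(name)
end

# A symbol-keyed hash of tool descriptors, mirroring #server_tools.
server_tools = {
  web_search: server_tool(:web_search),
  file_search: server_tool(:file_search)
}
```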