Class: LLM::OpenAI
- Defined in:
- lib/llm/providers/openai.rb,
lib/llm/providers/openai/audio.rb,
lib/llm/providers/openai/files.rb,
lib/llm/providers/openai/format.rb,
lib/llm/providers/openai/images.rb,
lib/llm/providers/openai/models.rb,
lib/llm/providers/openai/responses.rb,
lib/llm/providers/openai/moderations.rb,
lib/llm/providers/openai/error_handler.rb,
lib/llm/providers/openai/stream_parser.rb,
lib/llm/providers/openai/vector_stores.rb,
lib/llm/providers/openai/responses/stream_parser.rb
Overview
The OpenAI class implements a provider for OpenAI.
Defined Under Namespace
Modules: Response
Classes: Audio, Files, Images, Models, Moderations, Responses, VectorStores
Constant Summary collapse
- HOST =
"api.openai.com"
Instance Method Summary collapse
-
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the OpenAI web search tool.
-
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response
Provides an embedding.
-
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API.
-
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI’s response API.
-
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI’s image generation API.
-
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI’s audio generation API.
-
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI’s files API.
-
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI’s models API.
-
#moderations ⇒ LLM::OpenAI::Moderations
Provides an interface to OpenAI’s moderation API.
-
#vector_stores ⇒ LLM::OpenAI::VectorStores
Provides an interface to OpenAI’s vector store API.
-
#assistant_role ⇒ String
Returns the role of the assistant in the conversation.
-
#default_model ⇒ String
Returns the default model for chat completions.
-
#tools ⇒ Hash{Symbol => LLM::Tool}
-
#initialize ⇒ OpenAI
constructor
A new instance of OpenAI.
Methods inherited from Provider
#chat, #chat!, clients, #inspect, mutex, #respond, #respond!, #schema, #tool, #with
Constructor Details
Instance Method Details
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the OpenAI web search tool.
# File 'lib/llm/providers/openai.rb', line 175

def web_search(query:)
  responses
    .create(query, store: false, tools: [tools[:web_search]])
    .extend(LLM::OpenAI::Response::WebSearch)
end
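The chained extend call above is the pattern these provider methods share: a generic response object is extended with a module that adds endpoint-specific readers. A minimal sketch, with hypothetical module and struct names standing in for the gem's real ones:

```ruby
# Hypothetical reader module, standing in for LLM::OpenAI::Response::WebSearch
module WebSearchReaders
  # Convenience reader over the parsed response body
  def search_results
    body.fetch("results", [])
  end
end

# Hypothetical stand-in for the generic response object
Response = Struct.new(:body)

raw = Response.new({"results" => [{"url" => "https://example.com"}]})
res = raw.extend(WebSearchReaders)
res.search_results.first["url"] # => "https://example.com"
```

Extending a single instance (rather than including the module in the class) keeps the readers scoped to responses from this one endpoint.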
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response
Provides an embedding.
# File 'lib/llm/providers/openai.rb', line 49

def embed(input, model: "text-embedding-3-small", **params)
  req = Net::HTTP::Post.new("/v1/embeddings", headers)
  req.body = JSON.dump({input:, model:}.merge!(params))
  res = execute(request: req)
  LLM::Response.new(res).extend(LLM::OpenAI::Response::Embedding)
end
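A sketch of the payload this builds for /v1/embeddings, assuming only that extra keyword params are merged into the JSON body (`dimensions` is used here purely as an example parameter):

```ruby
require "json"

input  = "Hello, world"
model  = "text-embedding-3-small"
params = {dimensions: 256} # example extra parameter

# Mirror the merge-then-dump step from #embed
body    = JSON.dump({input: input, model: model}.merge!(params))
payload = JSON.parse(body)
payload["model"]      # => "text-embedding-3-small"
payload["dimensions"] # => 256
```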
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API.
# File 'lib/llm/providers/openai.rb', line 66

def complete(prompt, params = {})
  params = {role: :user, model: default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role, stream = params.delete(:role), params.delete(:stream)
  params[:stream] = true if stream.respond_to?(:<<) || stream == true
  req = Net::HTTP::Post.new("/v1/chat/completions", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  body = JSON.dump({messages: format(messages, :complete).flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = execute(request: req, stream:)
  LLM::Response.new(res).extend(LLM::OpenAI::Response::Completion)
end
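The merge-and-append steps above can be sketched with plain hashes: caller params override the defaults, :role is split out, and the prompt is appended after any :messages the caller passed. Message below is a hypothetical stand-in for the gem's message object:

```ruby
# Hypothetical stand-in for the gem's Message class
Message = Struct.new(:role, :content)

# Defaults first, caller params override them
params = {role: :user, model: "gpt-4.1"}.merge!(
  {model: "gpt-4o-mini", messages: [Message.new(:system, "Be terse.")]}
)

role     = params.delete(:role)
messages = [*(params.delete(:messages) || []), Message.new(role, "Hello!")]

params               # => {model: "gpt-4o-mini"}
messages.map(&:role) # => [:system, :user]
```

Splatting `params.delete(:messages) || []` means a caller-supplied history and a bare prompt both take the same code path.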
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI’s response API.
# File 'lib/llm/providers/openai.rb', line 83

def responses
  LLM::OpenAI::Responses.new(self)
end
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI’s image generation API.
# File 'lib/llm/providers/openai.rb', line 91

def images
  LLM::OpenAI::Images.new(self)
end
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI’s audio generation API.
# File 'lib/llm/providers/openai.rb', line 99

def audio
  LLM::OpenAI::Audio.new(self)
end
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI’s files API.
# File 'lib/llm/providers/openai.rb', line 107

def files
  LLM::OpenAI::Files.new(self)
end
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI’s models API.
# File 'lib/llm/providers/openai.rb', line 115

def models
  LLM::OpenAI::Models.new(self)
end
#moderations ⇒ LLM::OpenAI::Moderations
Provides an interface to OpenAI’s moderation API.
# File 'lib/llm/providers/openai.rb', line 124

def moderations
  LLM::OpenAI::Moderations.new(self)
end
#vector_stores ⇒ LLM::OpenAI::VectorStores
Provides an interface to OpenAI’s vector store API.
# File 'lib/llm/providers/openai.rb', line 132

def vector_stores
  LLM::OpenAI::VectorStores.new(self)
end
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually “assistant” or “model”
# File 'lib/llm/providers/openai.rb', line 138

def assistant_role
  "assistant"
end
#default_model ⇒ String
Returns the default model for chat completions.
# File 'lib/llm/providers/openai.rb', line 146

def default_model
  "gpt-4.1"
end
#tools ⇒ Hash{Symbol => LLM::Tool}
This method includes certain tools that require configuration through a set of options that are easier to set through the LLM::Provider#tool method.
# File 'lib/llm/providers/openai.rb', line 156

def tools
  {
    web_search: tool(:web_search),
    file_search: tool(:file_search),
    image_generation: tool(:image_generation),
    code_interpreter: tool(:code_interpreter),
    computer_use: tool(:computer_use)
  }
end
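The symbol keys make it easy to cherry-pick preconfigured tools for a request, as #web_search does with tools[:web_search]. A sketch with a hypothetical ToolHandle standing in for LLM::Tool:

```ruby
# Hypothetical stand-in for LLM::Tool
ToolHandle = Struct.new(:name)

tools = {
  web_search:       ToolHandle.new(:web_search),
  code_interpreter: ToolHandle.new(:code_interpreter)
}

# Select a subset of tools by key, e.g. to pass to a request
selected = tools.values_at(:web_search, :code_interpreter)
selected.map(&:name) # => [:web_search, :code_interpreter]
```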