Class: LLM::Gemini
- Defined in:
- lib/llm/providers/gemini.rb,
lib/llm/providers/gemini/audio.rb,
lib/llm/providers/gemini/files.rb,
lib/llm/providers/gemini/images.rb,
lib/llm/providers/gemini/models.rb,
lib/llm/providers/gemini/error_handler.rb,
lib/llm/providers/gemini/stream_parser.rb,
lib/llm/providers/gemini/request_adapter.rb,
lib/llm/providers/gemini/response_adapter.rb
Overview
The Gemini class implements a provider for Gemini. The provider accepts multiple input types (text, images, audio, and video). Files under 20MB can be provided inline with the prompt; larger files must be uploaded through the Gemini Files API.
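As a sketch of that size rule (the constant and helper names below are illustrative, not part of the library's API), a prompt part can be built inline for small files, while larger files would first be uploaded through the Files API:

```ruby
require "base64"

# Illustrative sketch of the 20MB rule described above; INLINE_LIMIT and
# prompt_part are hypothetical names, not part of LLM::Gemini's API.
INLINE_LIMIT = 20 * 1024 * 1024 # Gemini's inline request-size ceiling

def prompt_part(path, mime_type)
  if File.size(path) < INLINE_LIMIT
    # Small files can travel inline, base64-encoded in the request body.
    {inline_data: {mime_type: mime_type, data: Base64.strict_encode64(File.binread(path))}}
  else
    # Larger files must be uploaded via the Gemini Files API first and
    # then referenced by the URI returned from the upload.
    {file_data: {mime_type: mime_type, file_uri: "<uri returned by the Files API>"}}
  end
end
```

In practice the library performs this choice for you; the sketch only shows why two delivery paths exist.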
Defined Under Namespace
Classes: Audio, Files, Images, Models
Constant Summary
- HOST = "generativelanguage.googleapis.com"
Instance Method Summary

- #assistant_role ⇒ String
  Returns the role of the assistant in the conversation.
- #embed(input, model: "text-embedding-004", **params) ⇒ LLM::Response
  Provides an embedding.
- #complete(prompt, params = {}) ⇒ LLM::Response
  Provides an interface to the chat completions API.
- #audio ⇒ LLM::Gemini::Audio
  Provides an interface to Gemini's audio API.
- #images ⇒ LLM::Gemini::Images
  Provides an interface to Gemini's image generation API.
- #files ⇒ LLM::Gemini::Files
  Provides an interface to Gemini's file management API.
- #models ⇒ LLM::Gemini::Models
  Provides an interface to Gemini's models API.
- #default_model ⇒ String
  Returns the default model for chat completions.
- #server_tools ⇒ Hash{Symbol => LLM::ServerTool}
  Returns the provider's server tools.
- #web_search(query:) ⇒ LLM::Response
  A convenience method for performing a web search using the Google Search tool.
- #user_role ⇒ Symbol
  Returns the provider's user role.
- #system_role ⇒ Symbol
  Returns the provider's system role.
- #developer_role ⇒ Symbol
  Returns the provider's developer role.
- #initialize ⇒ Gemini (constructor)
  A new instance of Gemini.
Methods inherited from Provider
#chat, clients, #inspect, #moderations, #respond, #responses, #schema, #server_tool, #tracer, #tracer=, #vector_stores, #with
Constructor Details
#initialize ⇒ Gemini
Returns a new instance of Gemini.
Instance Method Details
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually "assistant" or "model".
# File 'lib/llm/providers/gemini.rb', line 163
def assistant_role
  "model"
end
#embed(input, model: "text-embedding-004", **params) ⇒ LLM::Response
Provides an embedding
# File 'lib/llm/providers/gemini.rb', line 47
def embed(input, model: "text-embedding-004", **params)
  model = model.respond_to?(:id) ? model.id : model
  path = ["/v1beta/models/#{model}", "embedContent?key=#{@key}"].join(":")
  req = Net::HTTP::Post.new(path, headers)
  req.body = LLM.json.dump({content: {parts: [{text: input}]}})
  res, span = execute(request: req, operation: "embeddings", model:)
  res = ResponseAdapter.adapt(res, type: :embedding)
  finish_trace(operation: "embeddings", model:, res:, span:)
end
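The path and body that #embed builds can be reproduced with the standard library alone (the key below is a placeholder, not a real credential):

```ruby
require "json"

# Reconstruct the request #embed sends; "GEMINI_API_KEY" is a placeholder.
model = "text-embedding-004"
key = "GEMINI_API_KEY"
path = ["/v1beta/models/#{model}", "embedContent?key=#{key}"].join(":")
body = JSON.dump({content: {parts: [{text: "Hello, world"}]}})
```

The join(":") produces Gemini's "model:method" URL form, i.e. /v1beta/models/text-embedding-004:embedContent.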
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API
# File 'lib/llm/providers/gemini.rb', line 67
def complete(prompt, params = {})
  params, stream, tools, role, model = normalize_complete_params(params)
  req = build_complete_request(prompt, params, role, model, stream)
  res, span = execute(request: req, stream: stream, operation: "chat", model:)
  res = ResponseAdapter.adapt(res, type: :completion)
          .extend(Module.new { define_method(:__tools__) { tools } })
  finish_trace(operation: "chat", model:, res:, span:)
end
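Note how the response is extended with an anonymous module so that __tools__ is defined on that single object only. A minimal stdlib illustration of the pattern:

```ruby
# Per-object extension: only this one response instance gains __tools__;
# other objects of the same class are unaffected.
tools = [:google_search]
response = Object.new
response.extend(Module.new { define_method(:__tools__) { tools } })

response.__tools__                  # the captured tools array
Object.new.respond_to?(:__tools__)  # false: the module touched one object
```

define_method inside the Module.new block is a closure, which is what lets the module capture the request-local tools variable.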
#audio ⇒ LLM::Gemini::Audio
Provides an interface to Gemini's audio API
# File 'lib/llm/providers/gemini.rb', line 80
def audio
  LLM::Gemini::Audio.new(self)
end
#images ⇒ LLM::Gemini::Images
Provides an interface to Gemini's image generation API
# File 'lib/llm/providers/gemini.rb', line 88
def images
  LLM::Gemini::Images.new(self)
end
#files ⇒ LLM::Gemini::Files
Provides an interface to Gemini's file management API
# File 'lib/llm/providers/gemini.rb', line 96
def files
  LLM::Gemini::Files.new(self)
end
#models ⇒ LLM::Gemini::Models
Provides an interface to Gemini's models API
# File 'lib/llm/providers/gemini.rb', line 104
def models
  LLM::Gemini::Models.new(self)
end
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/gemini.rb', line 112
def default_model
  "gemini-2.5-flash"
end
#server_tools ⇒ Hash{Symbol => LLM::ServerTool}
Returns the provider's server tools. Some of these tools require configuration through a set of options that are easier to set through the LLM::Provider#server_tool method.
# File 'lib/llm/providers/gemini.rb', line 123
def server_tools
  {
    google_search: server_tool(:google_search),
    code_execution: server_tool(:code_execution),
    url_context: server_tool(:url_context)
  }
end
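The returned hash maps tool names to LLM::ServerTool instances. The sketch below uses a hypothetical Struct stand-in for LLM::ServerTool and the #server_tool helper, purely to show the shape of the result:

```ruby
# Hypothetical stand-ins for LLM::ServerTool and LLM::Provider#server_tool,
# used only to illustrate the hash that #server_tools returns.
ServerTool = Struct.new(:name, :options)

def server_tool(name, **options)
  ServerTool.new(name, options)
end

server_tools = {
  google_search: server_tool(:google_search),
  code_execution: server_tool(:code_execution),
  url_context: server_tool(:url_context)
}
```

A tool looked up from this hash can then be passed to #complete through its tools: parameter, as #web_search does with the Google Search tool.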
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the Google Search tool.
# File 'lib/llm/providers/gemini.rb', line 136
def web_search(query:)
  ResponseAdapter.adapt(
    complete(query, tools: [server_tools[:google_search]]),
    type: :web_search
  )
end
#user_role ⇒ Symbol
Returns the provider's user role.
# File 'lib/llm/providers/gemini.rb', line 143
def user_role
  :user
end
#system_role ⇒ Symbol
Returns the provider's system role.
# File 'lib/llm/providers/gemini.rb', line 150
def system_role
  :user
end
#developer_role ⇒ Symbol
Returns the provider's developer role.
# File 'lib/llm/providers/gemini.rb', line 157
def developer_role
  :user
end