Class: LLM::Google
- Defined in:
- lib/llm/providers/google.rb,
lib/llm/providers/google/audio.rb,
lib/llm/providers/google/files.rb,
lib/llm/providers/google/utils.rb,
lib/llm/providers/google/images.rb,
lib/llm/providers/google/models.rb,
lib/llm/providers/google/error_handler.rb,
lib/llm/providers/google/stream_parser.rb,
lib/llm/providers/google/request_adapter.rb,
lib/llm/providers/google/response_adapter.rb
Overview
The Google class implements a provider for Gemini. The Google provider accepts multiple input modalities (text, images, audio, and video). Inputs can be provided inline via the prompt for files under 20MB, or via the Gemini Files API for files over 20MB.
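As a rough sketch of the size rule above, the choice between the two delivery paths comes down to a byte-count threshold. The helper name `delivery_path` is hypothetical and not part of LLM::Google; it only illustrates the 20MB cutoff:

```ruby
# Inline data is limited to 20MB per request; larger files go through
# the Gemini Files API. A hypothetical helper for choosing the path:
INLINE_LIMIT = 20 * 1024 * 1024

def delivery_path(byte_size)
  byte_size <= INLINE_LIMIT ? :inline : :files_api
end

delivery_path(5 * 1024 * 1024)   # small file: sent inline with the prompt
delivery_path(50 * 1024 * 1024)  # large file: uploaded via the Files API
```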
Defined Under Namespace
Modules: Utils Classes: Audio, Files, Images, Models
Constant Summary
- HOST = "generativelanguage.googleapis.com"
Instance Method Summary
- #assistant_role ⇒ String
Returns the role of the assistant in the conversation.
- #name ⇒ Symbol
Returns the provider's name.
- #embed(input, model: "gemini-embedding-001", **params) ⇒ LLM::Response
Provides an embedding.
- #complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API.
- #audio ⇒ LLM::Google::Audio
Provides an interface to Gemini's audio API.
- #images ⇒ LLM::Google::Images
Provides an interface to Gemini's image generation API.
- #files ⇒ LLM::Google::Files
Provides an interface to Gemini's file management API.
- #models ⇒ LLM::Google::Models
Provides an interface to Gemini's models API.
- #default_model ⇒ String
Returns the default model for chat completions.
- #server_tools ⇒ Hash{Symbol => LLM::ServerTool}
Returns the server tools supported by the provider.
- #web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the Google Search tool.
- #user_role ⇒ Symbol
Returns the provider's user role.
- #system_role ⇒ Symbol
Returns the provider's system role.
- #developer_role ⇒ Symbol
Returns the provider's developer role.
- #initialize ⇒ Google constructor
A new instance of Google.
Methods inherited from Provider
#chat, #inspect, #interrupt!, #moderations, #persist!, #respond, #responses, #schema, #server_tool, #streamable?, #tool_role, #tracer, #tracer=, #vector_stores, #with
Constructor Details
Instance Method Details
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually "assistant" or "model"
# File 'lib/llm/providers/google.rb', line 174

def assistant_role
  "model"
end
#name ⇒ Symbol
Returns the provider's name
# File 'lib/llm/providers/google.rb', line 45

def name
  :google
end
#embed(input, model: "gemini-embedding-001", **params) ⇒ LLM::Response
Provides an embedding
# File 'lib/llm/providers/google.rb', line 56

def embed(input, model: "gemini-embedding-001", **params)
  model = model.respond_to?(:id) ? model.id : model
  path = ["/v1beta/models/#{model}", "embedContent?key=#{@key}"].join(":")
  req = Net::HTTP::Post.new(path, headers)
  req.body = LLM.json.dump({content: {parts: [{text: input}]}})
  res, span, tracer = execute(request: req, operation: "embeddings", model:)
  res = ResponseAdapter.adapt(res, type: :embedding)
  tracer.on_request_finish(operation: "embeddings", model:, res:, span:)
  res
end
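To illustrate how #embed constructs its request, here is a standalone sketch of the same path and JSON body construction, using a placeholder key (the real key comes from @key) and plain stdlib JSON in place of LLM.json; headers and the actual HTTP call are omitted:

```ruby
require "net/http"
require "json"

key = "PLACEHOLDER_KEY"          # stand-in for the real API key
model = "gemini-embedding-001"

# The Gemini endpoint uses the "model:method?key=..." form seen above.
path = ["/v1beta/models/#{model}", "embedContent?key=#{key}"].join(":")
req = Net::HTTP::Post.new(path, {"content-type" => "application/json"})
req.body = JSON.dump({content: {parts: [{text: "Hello, world"}]}})

path  # => "/v1beta/models/gemini-embedding-001:embedContent?key=PLACEHOLDER_KEY"
```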
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API
# File 'lib/llm/providers/google.rb', line 77

def complete(prompt, params = {})
  params, stream, tools, role, model = normalize_complete_params(params)
  req = build_complete_request(prompt, params, role, model, stream)
  res, span, tracer = execute(request: req, stream: stream, operation: "chat", model:)
  res = ResponseAdapter.adapt(res, type: :completion)
          .extend(Module.new { define_method(:__tools__) { tools } })
  tracer.on_request_finish(operation: "chat", model:, res:, span:)
  res
end
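The response object is extended with a singleton __tools__ reader so downstream code can recover the tools that were passed with the request. The technique in isolation, using a bare Object as a stand-in for the response:

```ruby
# Extending a single object with an anonymous module adds the method to
# that object alone, without modifying the response class itself. The
# define_method block closes over the local `tools` variable.
tools = [:google_search]
res = Object.new
res.extend(Module.new { define_method(:__tools__) { tools } })

res.__tools__                        # => [:google_search]
res.respond_to?(:__tools__)          # => true
Object.new.respond_to?(:__tools__)   # => false
```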
#audio ⇒ LLM::Google::Audio
Provides an interface to Gemini's audio API
# File 'lib/llm/providers/google.rb', line 91

def audio
  LLM::Google::Audio.new(self)
end
#images ⇒ LLM::Google::Images
Provides an interface to Gemini's image generation API
# File 'lib/llm/providers/google.rb', line 99

def images
  LLM::Google::Images.new(self)
end
#files ⇒ LLM::Google::Files
Provides an interface to Gemini's file management API
# File 'lib/llm/providers/google.rb', line 107

def files
  LLM::Google::Files.new(self)
end
#models ⇒ LLM::Google::Models
Provides an interface to Gemini's models API
# File 'lib/llm/providers/google.rb', line 115

def models
  LLM::Google::Models.new(self)
end
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/google.rb', line 123

def default_model
  "gemini-2.5-flash"
end
#server_tools ⇒ Hash{Symbol => LLM::ServerTool}
Returns the server tools supported by the provider. Some of the returned tools require configuration options that are easier to set through the LLM::Provider#server_tool method.
# File 'lib/llm/providers/google.rb', line 134

def server_tools
  {
    google_search: server_tool(:google_search),
    code_execution: server_tool(:code_execution),
    url_context: server_tool(:url_context)
  }
end
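A minimal sketch of how the returned hash is used. ServerTool here is a Struct stand-in for LLM::ServerTool (the real objects come from server_tool); the point is that each tool is looked up by its Symbol key, e.g. to pass to #complete:

```ruby
# Stand-in for LLM::ServerTool, for illustration only.
ServerTool = Struct.new(:name)

server_tools = {
  google_search: ServerTool.new(:google_search),
  code_execution: ServerTool.new(:code_execution),
  url_context: ServerTool.new(:url_context)
}

# Select one tool by its Symbol key.
tool = server_tools[:google_search]
tool.name  # => :google_search
```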
#web_search(query:) ⇒ LLM::Response
A convenience method for performing a web search using the Google Search tool.
# File 'lib/llm/providers/google.rb', line 147

def web_search(query:)
  ResponseAdapter.adapt(
    complete(query, tools: [server_tools[:google_search]]),
    type: :web_search
  )
end
#user_role ⇒ Symbol
Returns the provider's user role
# File 'lib/llm/providers/google.rb', line 154

def user_role
  :user
end
#system_role ⇒ Symbol
Returns the provider's system role
# File 'lib/llm/providers/google.rb', line 161

def system_role
  :user
end
#developer_role ⇒ Symbol
Returns the provider's developer role
# File 'lib/llm/providers/google.rb', line 168

def developer_role
  :user
end